Jan 28 06:50:27 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 28 06:50:27 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 28 06:50:27 crc restorecon[4691]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 
06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:27
crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 
06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 28 06:50:28 crc restorecon[4691]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 06:50:28 crc restorecon[4691]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 
crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc 
restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 06:50:28 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 28 06:50:29 crc kubenswrapper[4776]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.041882 4776 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054538 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054611 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054618 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054623 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054629 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054635 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054640 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054645 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054651 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054658 4776 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054667 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054672 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054677 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054683 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054687 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054692 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054697 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054702 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054707 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054713 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054719 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054724 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054729 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054735 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054740 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054745 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054750 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054754 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054771 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054776 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054780 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054785 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054789 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054794 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054799 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054803 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054807 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054812 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054816 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054821 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054825 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054832 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054836 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054840 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054846 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054852 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054856 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054860 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054865 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054869 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054874 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054878 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054883 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054887 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054893 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054899 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054903 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054908 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054913 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054919 4776 feature_gate.go:330] unrecognized feature gate: Example
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054924 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054928 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054934 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054939 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054943 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054948 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054955 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054961 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054965 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054969 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.054974 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.055971 4776 flags.go:64] FLAG: --address="0.0.0.0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.055999 4776 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056012 4776 flags.go:64] FLAG: --anonymous-auth="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056021 4776 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056031 4776 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056037 4776 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056047 4776 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056055 4776 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056061 4776 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056067 4776 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056074 4776 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056081 4776 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056087 4776 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056094 4776 flags.go:64] FLAG: --cgroup-root=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056099 4776 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056105 4776 flags.go:64] FLAG: --client-ca-file=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056111 4776 flags.go:64] FLAG: --cloud-config=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056116 4776 flags.go:64] FLAG: --cloud-provider=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056122 4776 flags.go:64] FLAG: --cluster-dns="[]"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056128 4776 flags.go:64] FLAG: --cluster-domain=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056134 4776 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056139 4776 flags.go:64] FLAG: --config-dir=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056144 4776 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056150 4776 flags.go:64] FLAG: --container-log-max-files="5"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056158 4776 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056164 4776 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056170 4776 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056176 4776 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056182 4776 flags.go:64] FLAG: --contention-profiling="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056187 4776 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056196 4776 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056202 4776 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056207 4776 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056217 4776 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056223 4776 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056229 4776 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056235 4776 flags.go:64] FLAG: --enable-load-reader="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056241 4776 flags.go:64] FLAG: --enable-server="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056246 4776 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056254 4776 flags.go:64] FLAG: --event-burst="100"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056260 4776 flags.go:64] FLAG: --event-qps="50"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056266 4776 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056271 4776 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056276 4776 flags.go:64] FLAG: --eviction-hard=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056290 4776 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056295 4776 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056300 4776 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056307 4776 flags.go:64] FLAG: --eviction-soft=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056312 4776 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056317 4776 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056322 4776 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056328 4776 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056334 4776 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056339 4776 flags.go:64] FLAG: --fail-swap-on="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056345 4776 flags.go:64] FLAG: --feature-gates=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056353 4776 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056358 4776 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056364 4776 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056370 4776 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056375 4776 flags.go:64] FLAG: --healthz-port="10248"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056382 4776 flags.go:64] FLAG: --help="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056388 4776 flags.go:64] FLAG: --hostname-override=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056394 4776 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056399 4776 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056404 4776 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056410 4776 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056415 4776 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056421 4776 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056426 4776 flags.go:64] FLAG: --image-service-endpoint=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056431 4776 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056436 4776 flags.go:64] FLAG: --kube-api-burst="100"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056442 4776 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056448 4776 flags.go:64] FLAG: --kube-api-qps="50"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056452 4776 flags.go:64] FLAG: --kube-reserved=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056458 4776 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056463 4776 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056468 4776 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056473 4776 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056479 4776 flags.go:64] FLAG: --lock-file=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056484 4776 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056489 4776 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056495 4776 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056504 4776 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056514 4776 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056519 4776 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056525 4776 flags.go:64] FLAG: --logging-format="text"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056530 4776 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056537 4776 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056590 4776 flags.go:64] FLAG: --manifest-url=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056598 4776 flags.go:64] FLAG: --manifest-url-header=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056607 4776 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056613 4776 flags.go:64] FLAG: --max-open-files="1000000"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056620 4776 flags.go:64] FLAG: --max-pods="110"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056627 4776 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056633 4776 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056639 4776 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056644 4776 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056650 4776 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056655 4776 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056661 4776 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056675 4776 flags.go:64] FLAG: --node-status-max-images="50"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056681 4776 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056687 4776 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056693 4776 flags.go:64] FLAG: --pod-cidr=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056698 4776 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056707 4776 flags.go:64] FLAG: --pod-manifest-path=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056712 4776 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056717 4776 flags.go:64] FLAG: --pods-per-core="0"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056722 4776 flags.go:64] FLAG: --port="10250"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056728 4776 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056733 4776 flags.go:64] FLAG: --provider-id=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056738 4776 flags.go:64] FLAG: --qos-reserved=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056744 4776 flags.go:64] FLAG: --read-only-port="10255"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056749 4776 flags.go:64] FLAG: --register-node="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056755 4776 flags.go:64] FLAG: --register-schedulable="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056760 4776 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056772 4776 flags.go:64] FLAG: --registry-burst="10"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056777 4776 flags.go:64] FLAG: --registry-qps="5"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056783 4776 flags.go:64] FLAG: --reserved-cpus=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056791 4776 flags.go:64] FLAG: --reserved-memory=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056799 4776 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056806 4776 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056811 4776 flags.go:64] FLAG: --rotate-certificates="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056817 4776 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056823 4776 flags.go:64] FLAG: --runonce="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056828 4776 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056834 4776 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056840 4776 flags.go:64] FLAG: --seccomp-default="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056845 4776 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056851 4776 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056856 4776 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056862 4776 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056867 4776 flags.go:64] FLAG: --storage-driver-password="root"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056873 4776 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056878 4776 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056884 4776 flags.go:64] FLAG: --storage-driver-user="root"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056889 4776 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056894 4776 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056900 4776 flags.go:64] FLAG: --system-cgroups=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056905 4776 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056914 4776 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056919 4776 flags.go:64] FLAG: --tls-cert-file=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056925 4776 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056931 4776 flags.go:64] FLAG: --tls-min-version=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056937 4776 flags.go:64] FLAG: --tls-private-key-file=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056942 4776 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056948 4776 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056953 4776 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056959 4776 flags.go:64] FLAG: --v="2"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056968 4776 flags.go:64] FLAG: --version="false"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056978 4776 flags.go:64] FLAG: --vmodule=""
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056985 4776 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.056992 4776 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057130 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057138 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057144 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057149 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057154 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057158 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057163 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057167 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057172 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057177 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057181 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057185 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057190 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057194 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057199 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057203 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057208 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057212 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057216 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057221 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057226 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057230 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057235 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057239 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057245 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057250 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057254 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057259 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057264 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057269 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057280 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057285 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057290 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057294 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057299 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057303 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057307 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057314 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057320 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057324 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057329 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057333 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057337 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057340 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057345 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057348 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057353 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057357 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057361 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057366 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057371 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057377 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057382 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057387 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057392 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057397 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057401 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057406 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057411 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057415 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057421 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057427 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057436 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057441 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057447 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057453 4776 feature_gate.go:330] unrecognized feature gate: Example
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057458 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057462 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057467 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057478 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.057483 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.058371 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.072768 4776 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.072831 4776 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072934 4776 feature_gate.go:330] unrecognized feature gate: Example
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072948 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072955 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072962 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072968 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072975 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072982 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072988 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.072994 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073000 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073006
4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073012 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073018 4776 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073025 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073030 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073036 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073042 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073049 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073059 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073065 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073072 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073079 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073085 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073092 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073099 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073105 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073112 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073118 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073124 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073132 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073140 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073148 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073155 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073165 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073172 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073180 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073187 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073194 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073202 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073209 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073216 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073223 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073229 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073235 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073241 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 
06:50:29.073247 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073254 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073262 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073269 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073275 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073281 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073288 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073295 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073301 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073307 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073315 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073323 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073330 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073337 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073343 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073349 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073356 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073362 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073370 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073377 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073386 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073393 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073399 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073405 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073411 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073417 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.073429 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073680 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073692 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073698 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073705 4776 feature_gate.go:330] unrecognized 
feature gate: GCPClusterHostedDNS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073711 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073717 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073723 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073730 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073735 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073741 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073747 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073753 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073758 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073764 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073770 4776 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073776 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073782 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073789 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 
06:50:29.073794 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073800 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073806 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073812 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073818 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073824 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073830 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073838 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073848 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073855 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073862 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073872 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073881 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073889 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073896 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073902 4776 feature_gate.go:330] unrecognized feature gate: Example Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073909 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073917 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073924 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073930 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073938 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073945 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073950 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073956 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073962 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073967 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073974 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 06:50:29 crc kubenswrapper[4776]: 
W0128 06:50:29.073979 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073984 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073992 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.073997 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074003 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074009 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074015 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074023 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074030 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074035 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074041 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074047 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074053 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074060 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074067 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074073 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074079 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074085 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074092 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074098 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074106 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074113 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 06:50:29 crc 
kubenswrapper[4776]: W0128 06:50:29.074121 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074127 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074133 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.074141 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.074150 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.074428 4776 server.go:940] "Client rotation is on, will bootstrap in background" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.080289 4776 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.080414 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.082440 4776 server.go:997] "Starting client certificate rotation" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.082490 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.083825 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-01 07:37:51.81860776 +0000 UTC Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.083934 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.110830 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.113694 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.113932 4776 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.130093 4776 log.go:25] "Validated CRI v1 runtime API" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.173439 4776 log.go:25] "Validated CRI v1 image API" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.176209 4776 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.183981 4776 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-28-06-44-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.184039 4776 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.214062 4776 manager.go:217] Machine: {Timestamp:2026-01-28 06:50:29.210379354 +0000 UTC m=+0.626039594 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:53a286a7-147d-439f-bf29-b3b09993325f BootID:32f27aa7-2aa2-417b-80de-993c2f103850 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 
Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f5:c9:98 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f5:c9:98 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:bf:45:dc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e5:67:45 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d4:21:20 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:32:5b:56 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9e:b5:0d:bf:7f:45 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:c1:95:5c:7f:03 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.214485 4776 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.214765 4776 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.215302 4776 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.215670 4776 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.215733 4776 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.216103 4776 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.216125 4776 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.216886 4776 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.216935 4776 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.217801 4776 state_mem.go:36] "Initialized new in-memory state store" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.217923 4776 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.221930 4776 kubelet.go:418] "Attempting to sync node with API server" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.221958 4776 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.221981 4776 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.222000 4776 kubelet.go:324] "Adding apiserver pod source" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.222015 4776 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.229331 4776 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.231427 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.231770 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.231879 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.231996 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.232120 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.233353 4776 kubelet.go:854] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235341 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235436 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235511 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235602 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235678 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235750 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235808 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235877 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.235978 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.236065 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.236157 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.236217 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.237019 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.237780 4776 server.go:1280] "Started kubelet" Jan 28 
06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.238637 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.238992 4776 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.239099 4776 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.239802 4776 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 06:50:29 crc systemd[1]: Started Kubernetes Kubelet. Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.242838 4776 server.go:460] "Adding debug handlers to kubelet server" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.249593 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.249667 4776 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.249788 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 06:15:53.934346836 +0000 UTC Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.249899 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.250483 4776 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.250503 4776 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.250635 4776 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.250682 4776 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.251455 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.251608 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.254148 4776 factory.go:55] Registering systemd factory Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.254182 4776 factory.go:221] Registration of the systemd container factory successfully Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.256120 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188ed263b2466457 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 
06:50:29.237736535 +0000 UTC m=+0.653396695,LastTimestamp:2026-01-28 06:50:29.237736535 +0000 UTC m=+0.653396695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.264090 4776 factory.go:153] Registering CRI-O factory Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.264151 4776 factory.go:221] Registration of the crio container factory successfully Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.264339 4776 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.264400 4776 factory.go:103] Registering Raw factory Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.264453 4776 manager.go:1196] Started watching for new ooms in manager Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265626 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265694 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265713 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265728 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265742 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265756 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265770 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265784 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265802 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265817 4776 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265830 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265844 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265856 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265872 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265894 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265908 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265914 4776 manager.go:319] Starting recovery of all containers Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.265921 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266592 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266640 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266659 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266700 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266719 4776 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266732 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266745 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266762 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266778 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266810 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266826 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266840 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266854 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266869 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266884 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266900 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266916 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266936 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266952 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.266968 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267019 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267035 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267051 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267066 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267083 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267097 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267113 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267130 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267196 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267211 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267227 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267243 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267258 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267274 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267289 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: 
I0128 06:50:29.267314 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267330 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267347 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267366 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267384 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267408 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267425 4776 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267440 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267455 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267472 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267488 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267503 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267520 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267536 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267576 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267592 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267607 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267626 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267642 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267657 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267669 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267681 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267694 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267707 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267741 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267755 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267769 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267781 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267793 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267806 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267820 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" 
seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267832 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267845 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267858 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267872 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267883 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267895 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 
06:50:29.267908 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267922 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267934 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267945 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267958 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267973 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.267985 4776 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268000 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268025 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268045 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268059 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268081 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268097 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268112 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268123 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268140 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268153 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268164 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268176 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268188 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268199 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268213 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268240 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268257 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268273 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" 
seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268286 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268298 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268310 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268320 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268332 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268342 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268352 4776 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268363 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268373 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268384 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268395 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268406 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268416 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268427 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268437 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268447 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268465 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268475 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268485 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268497 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268506 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268515 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268525 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268537 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268576 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268601 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268616 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268627 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268638 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268650 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268660 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268671 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268696 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268733 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268744 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268755 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268767 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" 
seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268778 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268790 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268803 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268823 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268836 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268852 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268863 4776 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268875 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268892 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268914 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268930 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268948 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268965 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268981 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.268996 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269012 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269026 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269037 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269050 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269062 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269073 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269085 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269096 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269109 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269122 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269134 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269154 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269180 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269197 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269213 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269227 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269240 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269253 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269266 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269279 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269293 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269305 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269318 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269331 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269343 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269355 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269368 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269379 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269392 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269403 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269413 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269425 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269438 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269450 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269462 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269473 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269485 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269496 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269507 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.269519 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.271103 4776 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.271136 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.271151 4776 reconstruct.go:97] "Volume reconstruction finished" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.271161 4776 reconciler.go:26] "Reconciler: start to sync state" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.293018 4776 manager.go:324] Recovery completed Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.301156 4776 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.303260 4776 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.303312 4776 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.303345 4776 kubelet.go:2335] "Starting kubelet main sync loop" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.303404 4776 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.304487 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.304656 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.305507 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.308070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.308129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.308144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.309524 4776 
cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.309563 4776 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.309675 4776 state_mem.go:36] "Initialized new in-memory state store" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.323867 4776 policy_none.go:49] "None policy: Start" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.324983 4776 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.325024 4776 state_mem.go:35] "Initializing new in-memory state store" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.350062 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.388102 4776 manager.go:334] "Starting Device Plugin manager" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.389666 4776 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.389723 4776 server.go:79] "Starting device plugin registration server" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.390285 4776 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.390311 4776 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.390589 4776 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.390743 4776 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.390756 4776 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.397127 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.404433 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.404658 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.405967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.406014 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.406027 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.406198 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.406755 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.406870 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.407680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.407732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.407748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.407939 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.408167 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.408238 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.408541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.408710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.408730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409145 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409280 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc 
kubenswrapper[4776]: I0128 06:50:29.409692 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.409763 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410742 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.410816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.411154 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.411193 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.411756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.411792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.411807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.412039 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.412069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.412113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.412138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.412077 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.413165 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.413203 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc 
kubenswrapper[4776]: I0128 06:50:29.413218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.451620 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473317 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473377 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473406 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473432 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc 
kubenswrapper[4776]: I0128 06:50:29.473531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473592 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473647 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473706 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473801 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.473971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.474018 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.474080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.474129 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.491356 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.492858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.492930 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.492948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.492982 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.493517 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575184 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575262 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575328 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575367 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575397 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc 
kubenswrapper[4776]: I0128 06:50:29.575583 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575495 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575589 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575662 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575854 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575879 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.575944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576026 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576062 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576135 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576167 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576197 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576252 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576254 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576341 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576370 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576405 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576138 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.576276 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.693947 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.699542 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.699632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.699647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.699683 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.700492 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.740757 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.770403 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.782928 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.788387 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-83261c517e5fb72d5160de524a84d62ecc3d7a6325f9fe71f94eefe5640e8503 WatchSource:0}: Error finding container 83261c517e5fb72d5160de524a84d62ecc3d7a6325f9fe71f94eefe5640e8503: Status 404 returned error can't find the container with id 83261c517e5fb72d5160de524a84d62ecc3d7a6325f9fe71f94eefe5640e8503 Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.803442 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-af2ae5eb58f2943ef42e0c9de1f727d93f90c2a46ffe97fc9fc6a130ce1ebe14 WatchSource:0}: Error finding container af2ae5eb58f2943ef42e0c9de1f727d93f90c2a46ffe97fc9fc6a130ce1ebe14: Status 404 returned error can't find the container with id af2ae5eb58f2943ef42e0c9de1f727d93f90c2a46ffe97fc9fc6a130ce1ebe14 Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.806864 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0960de1ecf6eaf7689de4d461e05b594b27455b0a8e463e9e551b6a54e6a0287 WatchSource:0}: Error finding container 0960de1ecf6eaf7689de4d461e05b594b27455b0a8e463e9e551b6a54e6a0287: Status 404 returned error can't find the container with id 0960de1ecf6eaf7689de4d461e05b594b27455b0a8e463e9e551b6a54e6a0287 Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.814524 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: I0128 06:50:29.824596 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.839584 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-adf4369fef6d34cfd838de2f00391e37981e1a077f725dce3c22e1ea9f2edc65 WatchSource:0}: Error finding container adf4369fef6d34cfd838de2f00391e37981e1a077f725dce3c22e1ea9f2edc65: Status 404 returned error can't find the container with id adf4369fef6d34cfd838de2f00391e37981e1a077f725dce3c22e1ea9f2edc65 Jan 28 06:50:29 crc kubenswrapper[4776]: W0128 06:50:29.842215 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d0319e8e4802971d371d470cf21fd10c26c49f8e67931328c27d248e7ad743e8 WatchSource:0}: Error finding container d0319e8e4802971d371d470cf21fd10c26c49f8e67931328c27d248e7ad743e8: Status 404 returned error can't find the container with id d0319e8e4802971d371d470cf21fd10c26c49f8e67931328c27d248e7ad743e8 Jan 28 06:50:29 crc kubenswrapper[4776]: E0128 06:50:29.853682 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.101536 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.102792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.102846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.102865 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.102907 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.103651 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.239658 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.249900 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:35:53.390237477 +0000 UTC Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.312963 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d0319e8e4802971d371d470cf21fd10c26c49f8e67931328c27d248e7ad743e8"} Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.314281 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"adf4369fef6d34cfd838de2f00391e37981e1a077f725dce3c22e1ea9f2edc65"} Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.315418 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0960de1ecf6eaf7689de4d461e05b594b27455b0a8e463e9e551b6a54e6a0287"} Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.316467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af2ae5eb58f2943ef42e0c9de1f727d93f90c2a46ffe97fc9fc6a130ce1ebe14"} Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.317641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"83261c517e5fb72d5160de524a84d62ecc3d7a6325f9fe71f94eefe5640e8503"} Jan 28 06:50:30 crc kubenswrapper[4776]: W0128 06:50:30.506702 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.507395 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:30 crc kubenswrapper[4776]: W0128 06:50:30.564256 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 
06:50:30.564410 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.654661 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Jan 28 06:50:30 crc kubenswrapper[4776]: W0128 06:50:30.678775 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.678900 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:30 crc kubenswrapper[4776]: W0128 06:50:30.730760 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.730922 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.904304 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.906836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.906904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.906919 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:30 crc kubenswrapper[4776]: I0128 06:50:30.906955 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 06:50:30 crc kubenswrapper[4776]: E0128 06:50:30.907677 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.192713 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 06:50:31 crc kubenswrapper[4776]: E0128 06:50:31.194246 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.240294 4776 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.250308 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:58:37.278031469 +0000 UTC Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.330752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.330836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.330860 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.330870 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.330888 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc"} Jan 28 06:50:31 crc kubenswrapper[4776]: 
I0128 06:50:31.332379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.332426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.332438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.334339 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="402128c3908b422712ff4b4af5dfdd83c28933aba849677082d8a721a4ab8657" exitCode=0 Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.334471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"402128c3908b422712ff4b4af5dfdd83c28933aba849677082d8a721a4ab8657"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.335257 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.336373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.336418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.336437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.337429 4776 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="264e0a9def902bb6d6cd32c0e01bcdfcb11e913e07b46d81f26cc61d731d9162" exitCode=0 Jan 28 06:50:31 crc kubenswrapper[4776]: 
I0128 06:50:31.337482 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"264e0a9def902bb6d6cd32c0e01bcdfcb11e913e07b46d81f26cc61d731d9162"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.337591 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.339429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.339488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.339504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.340437 4776 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eb75be542bd004823417e6aafbd775ecd08610668fa6af6556b8da48a2088813" exitCode=0 Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.340524 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eb75be542bd004823417e6aafbd775ecd08610668fa6af6556b8da48a2088813"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.340822 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.342384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.342450 4776 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.342474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.343212 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133" exitCode=0 Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.343280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133"} Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.343492 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.345253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.345285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.345301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.350656 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.351766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.351825 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 
06:50:31 crc kubenswrapper[4776]: I0128 06:50:31.351846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: W0128 06:50:32.193734 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Jan 28 06:50:32 crc kubenswrapper[4776]: E0128 06:50:32.193844 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.239874 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.250587 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 14:21:10.211522228 +0000 UTC
Jan 28 06:50:32 crc kubenswrapper[4776]: E0128 06:50:32.255372 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s"
Jan 28 06:50:32 crc kubenswrapper[4776]: W0128 06:50:32.295685 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused
Jan 28 06:50:32 crc kubenswrapper[4776]: E0128 06:50:32.295763 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.348847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa739ff1424c34a85e3232c1ddd5aa6b3504fcc7acbe984d171eadba62d346af"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.348914 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f0ce1ec522bc9b86419f139981846c17270fc2b3bb6a52fe53ee9b307419478d"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.348947 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e133f06b9cf06d07e9ff2f9bc286eaf75f8c0cdecde304cbfcb551c43f293d9c"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.349065 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.350103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.350129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.350141 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.353692 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.353723 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.353732 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.353741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.356411 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bd4a1b52c43ebe278349f196e5be4c7a1fd4ba19d9355c93b57cbd8e1c7dc364" exitCode=0
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.356460 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bd4a1b52c43ebe278349f196e5be4c7a1fd4ba19d9355c93b57cbd8e1c7dc364"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.356617 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.357414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.357446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.357459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.360367 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.360389 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bcbe3f7c17e961828eb62ccb2b796d3b7af87cf500cd2a9cb142238f16e17026"}
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.360443 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.362730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.362764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.362777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.366200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.366236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.366252 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.508622 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.510001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.510045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.510058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:32 crc kubenswrapper[4776]: I0128 06:50:32.510092 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 28 06:50:32 crc kubenswrapper[4776]: E0128 06:50:32.510710 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.250899 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 05:37:05.96583792 +0000 UTC
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.366179 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7d3415c5860dac098474ee1a4de665136b56b9f711d69611a10c60527ab807c4" exitCode=0
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.366269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7d3415c5860dac098474ee1a4de665136b56b9f711d69611a10c60527ab807c4"}
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.366436 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.368342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.368394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.368412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.371743 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87411d785f0295ea402f62fe83bdbf09acdc14bdbd560ac1c3ae84c57ce43075"}
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.373167 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.373506 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.378402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.378507 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.378583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.381289 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.381360 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.381987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.382046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.382065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.383437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.383468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.383479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.734333 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.734571 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.735892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.735964 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:33 crc kubenswrapper[4776]: I0128 06:50:33.735979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.251051 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:04:59.914198039 +0000 UTC
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380747 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"094ca608fdd7e154333a299f35f9a560528e5798689b3880348ec70bfdd06bc6"}
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380817 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43d4092ae9fc7a9d65b0b4c82a3b7f19805120104e16820e970d4041d8cb84fd"}
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380818 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380899 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380897 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.380831 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d38f15c66993d66901632a288bf4bbf4ca3859f6879119faf6212aa6e6ccf1d"}
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.381199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"20de31d9a7f7946890a7da22f05b1fa0b01ee324326af9adb653f2500620bba0"}
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.381241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"186ca55d342fd92ab7a7354ebd43ce847495f10455160d5b1ab79407fc8c2ded"}
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.381966 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.382000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.382013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.383051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.383103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.383120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:34 crc kubenswrapper[4776]: I0128 06:50:34.804767 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.049150 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.049470 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.051440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.051510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.051529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.251473 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 17:17:54.214607708 +0000 UTC
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.382959 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.383718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.383758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.383773 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.410645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.413791 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.462325 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.462530 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.462591 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.463979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.464033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.464048 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.521656 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.711080 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.712466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.712570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.712583 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:35 crc kubenswrapper[4776]: I0128 06:50:35.712611 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.251668 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:47:50.340287671 +0000 UTC
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.386155 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.386231 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.386308 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:36 crc kubenswrapper[4776]: I0128 06:50:36.387826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.252498 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 17:14:46.469121727 +0000 UTC
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.388747 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.389672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.389830 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.389925 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.507469 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.507722 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.509148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.509325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:37 crc kubenswrapper[4776]: I0128 06:50:37.509455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.252928 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 02:44:34.804310608 +0000 UTC
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.733385 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.733750 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.735894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.735969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.735987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:38 crc kubenswrapper[4776]: I0128 06:50:38.741146 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.253306 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 16:25:14.02134426 +0000 UTC
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.393912 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.394052 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.396745 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.396778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:39 crc kubenswrapper[4776]: I0128 06:50:39.396788 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:39 crc kubenswrapper[4776]: E0128 06:50:39.397379 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.253782 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 17:58:17.976394902 +0000 UTC
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.396691 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.398311 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.398351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.398362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.404423 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:40 crc kubenswrapper[4776]: I0128 06:50:40.596247 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 06:50:41 crc kubenswrapper[4776]: I0128 06:50:41.253955 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 22:47:12.742588401 +0000 UTC
Jan 28 06:50:41 crc kubenswrapper[4776]: I0128 06:50:41.398855 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:41 crc kubenswrapper[4776]: I0128 06:50:41.399914 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:41 crc kubenswrapper[4776]: I0128 06:50:41.399969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:41 crc kubenswrapper[4776]: I0128 06:50:41.399986 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:42 crc kubenswrapper[4776]: I0128 06:50:42.254193 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 23:00:50.637698205 +0000 UTC
Jan 28 06:50:42 crc kubenswrapper[4776]: I0128 06:50:42.401154 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:42 crc kubenswrapper[4776]: I0128 06:50:42.402156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:42 crc kubenswrapper[4776]: I0128 06:50:42.402262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:42 crc kubenswrapper[4776]: I0128 06:50:42.402347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:43 crc kubenswrapper[4776]: W0128 06:50:43.101504 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.101731 4776 trace.go:236] Trace[762442334]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 06:50:33.099) (total time: 10002ms):
Jan 28 06:50:43 crc kubenswrapper[4776]: Trace[762442334]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:50:43.101)
Jan 28 06:50:43 crc kubenswrapper[4776]: Trace[762442334]: [10.002099526s] [10.002099526s] END
Jan 28 06:50:43 crc kubenswrapper[4776]: E0128 06:50:43.101776 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.241785 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.255087 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 03:09:31.978109619 +0000 UTC
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.597245 4776 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.597353 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 28 06:50:43 crc kubenswrapper[4776]: W0128 06:50:43.671597 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.672351 4776 trace.go:236] Trace[302356824]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 06:50:33.670) (total time: 10002ms):
Jan 28 06:50:43 crc kubenswrapper[4776]: Trace[302356824]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:50:43.671)
Jan 28 06:50:43 crc kubenswrapper[4776]: Trace[302356824]: [10.002069664s] [10.002069664s] END
Jan 28 06:50:43 crc kubenswrapper[4776]: E0128 06:50:43.672515 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.756560 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.756677 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.764647 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 28 06:50:43 crc kubenswrapper[4776]: I0128 06:50:43.764726 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.255194 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 23:30:37.932597102 +0000 UTC
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.409850 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.412071 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87411d785f0295ea402f62fe83bdbf09acdc14bdbd560ac1c3ae84c57ce43075" exitCode=255
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.412110 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"87411d785f0295ea402f62fe83bdbf09acdc14bdbd560ac1c3ae84c57ce43075"}
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.412261 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.413055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.413079 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.413087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:44 crc kubenswrapper[4776]: I0128 06:50:44.413491 4776 scope.go:117] "RemoveContainer" containerID="87411d785f0295ea402f62fe83bdbf09acdc14bdbd560ac1c3ae84c57ce43075"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.185948 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.255443 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 01:37:54.153850483 +0000 UTC
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.416435 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.417134 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.418744 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" exitCode=255
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.418847 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.418927 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17"}
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.419067 4776 scope.go:117] "RemoveContainer" containerID="87411d785f0295ea402f62fe83bdbf09acdc14bdbd560ac1c3ae84c57ce43075"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.419622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.419659 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.419668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.420153 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17"
Jan 28 06:50:45 crc kubenswrapper[4776]: E0128 06:50:45.420320 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc"
podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.443346 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.443501 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.445748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.445795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.445818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.462972 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.468343 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:45 crc kubenswrapper[4776]: I0128 06:50:45.526300 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.256044 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:35:55.464744992 +0000 UTC Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.422187 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.423914 4776 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.424366 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.424949 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.424971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.424980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.425399 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:50:46 crc kubenswrapper[4776]: E0128 06:50:46.425562 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.425865 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.425905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:46 crc kubenswrapper[4776]: I0128 06:50:46.425915 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 
06:50:47.256945 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 23:19:37.349709345 +0000 UTC Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.426222 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.426923 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.426954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.426963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.427450 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:50:47 crc kubenswrapper[4776]: E0128 06:50:47.427628 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.507523 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:47 crc kubenswrapper[4776]: I0128 06:50:47.679388 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.232320 4776 apiserver.go:52] "Watching apiserver" Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.239223 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.239486 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.239826 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.239864 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.239905 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.239900 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.239932 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.240047 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.240080 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.240106 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.240112 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244014 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244027 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244087 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244022 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244508 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244641 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.244703 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.249408 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.250217 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.251308 4776 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.257318 4776 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 16:54:06.105227946 +0000 UTC Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.270107 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.283432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.296324 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.308947 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.318169 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.327457 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.341778 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.354016 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.437755 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.437867 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.438071 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.752206 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.753403 4776 trace.go:236] Trace[1804152852]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 06:50:37.303) (total time: 11449ms): Jan 28 06:50:48 crc kubenswrapper[4776]: Trace[1804152852]: ---"Objects listed" error: 11449ms (06:50:48.753) Jan 28 06:50:48 crc kubenswrapper[4776]: Trace[1804152852]: [11.449520582s] [11.449520582s] END Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.753428 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.754838 4776 trace.go:236] Trace[1211059193]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 06:50:37.464) (total time: 11290ms): Jan 28 06:50:48 crc kubenswrapper[4776]: Trace[1211059193]: ---"Objects listed" error: 11290ms (06:50:48.754) Jan 28 06:50:48 crc kubenswrapper[4776]: Trace[1211059193]: [11.290203083s] [11.290203083s] END Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.754866 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.759006 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.759445 4776 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.763064 4776 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 
06:50:48.790635 4776 csr.go:261] certificate signing request csr-nwwb7 is approved, waiting to be issued Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.802331 4776 csr.go:257] certificate signing request csr-nwwb7 is issued Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860078 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860122 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860145 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860162 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860186 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860220 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860284 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860309 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860332 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860468 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860490 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860507 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860525 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860545 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860544 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860602 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860622 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860641 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860672 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860742 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860808 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860859 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860895 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860912 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860928 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860944 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860964 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860979 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860995 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861033 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861050 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861208 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861235 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861257 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861280 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861334 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.860846 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861087 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861118 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861169 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861197 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861295 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861323 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861380 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.861403 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:50:49.361374632 +0000 UTC m=+20.777034882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862393 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862440 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862472 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862499 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862526 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862575 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862604 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862633 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862659 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862718 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862746 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862800 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862825 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862875 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862903 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862931 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862988 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863038 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863059 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863084 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863134 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863155 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863181 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863201 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863225 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863250 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863273 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.863297 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863322 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863345 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863399 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863424 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863474 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863522 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863548 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863588 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863615 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.863637 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863689 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863714 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863736 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863758 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863782 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863803 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863849 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863873 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862494 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862582 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861632 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861650 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861662 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861837 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861971 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862042 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862131 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862145 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862165 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862206 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862235 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862664 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862788 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862895 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862895 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.862932 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863073 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863209 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863214 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863241 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863478 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863549 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863592 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863626 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.861416 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863813 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863845 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.863885 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864018 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864144 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864160 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864177 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864526 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864778 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864795 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.864979 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865622 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865641 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865635 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865849 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865867 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865937 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.865956 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866065 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866084 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866102 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866125 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866143 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866161 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866179 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866197 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866215 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866270 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866277 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866380 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866407 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866389 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866421 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866475 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866478 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866486 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866599 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866630 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866659 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866714 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866739 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866762 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866785 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866814 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866839 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 06:50:48 
crc kubenswrapper[4776]: I0128 06:50:48.866861 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866880 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866901 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866924 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866944 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866966 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.866985 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867857 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867866 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867900 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867934 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867961 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.867992 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868055 
4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868153 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868180 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868208 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868331 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.868434 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869089 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869374 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869410 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869805 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870177 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870237 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870294 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870338 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870568 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870653 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870687 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870718 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870809 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870840 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870872 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870900 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870925 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870948 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870994 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.871018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871041 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871064 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871085 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871132 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871176 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871201 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871242 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " 
Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871266 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871288 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871305 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871326 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871348 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871367 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871389 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871434 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871475 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 06:50:48 
crc kubenswrapper[4776]: I0128 06:50:48.871496 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.870838 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871372 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.871682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.872035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.872068 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.875263 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.875943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.876204 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.876499 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.876591 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.876820 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.877187 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.878673 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.878715 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.879099 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.877236 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.877496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.877796 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.877705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.878083 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.878171 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.869119 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.878623 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.879692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880165 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.879623 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880240 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880530 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880776 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.880993 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.881113 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.881282 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.881466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.881791 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882075 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882150 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882357 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882376 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.882959 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.883014 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.883316 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.883360 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.883702 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.883788 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.884369 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.884792 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.884908 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.885275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.885281 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.886800 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.887914 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.890076 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.890596 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.891633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.892895 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.893875 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.893955 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894002 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894103 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894124 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898422 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894352 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894358 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894431 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894520 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894669 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894704 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.894991 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895328 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895487 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895710 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895719 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.895975 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898722 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.898756 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-28 06:50:49.398733926 +0000 UTC m=+20.814394076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.895831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898814 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898936 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898958 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898981 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898999 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899040 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899059 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.899077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899196 4776 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899208 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899218 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899228 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899241 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899251 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899260 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899273 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899282 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899292 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899301 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899312 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899321 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 
06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899330 4776 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899340 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899353 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899362 4776 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899371 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899380 4776 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899393 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899401 4776 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899412 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899423 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899433 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899442 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899451 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899462 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899471 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node 
\"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899480 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899489 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899500 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899509 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899518 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899528 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899538 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899551 4776 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899590 4776 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899603 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899613 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899623 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899633 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899649 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899658 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899668 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899678 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899690 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899698 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899707 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899719 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899728 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" 
DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899737 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899745 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899757 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899766 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899776 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899785 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899796 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: 
I0128 06:50:48.899805 4776 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899814 4776 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899822 4776 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899834 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899842 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899853 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899865 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899875 4776 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899886 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899894 4776 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899907 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899916 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899926 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899934 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899947 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899957 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899966 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899977 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899986 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.899996 4776 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900005 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900017 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900026 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900037 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900072 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900083 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900093 4776 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900101 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900111 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900123 4776 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900132 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900148 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900160 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900169 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900178 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 
06:50:48.900187 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900188 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900199 4776 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900209 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900218 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900228 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900239 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900249 4776 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900258 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900267 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900281 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900290 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900298 4776 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900309 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" 
DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900319 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900327 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900330 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900336 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900357 4776 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900368 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900383 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900394 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900403 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900412 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900430 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900453 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900463 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900475 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" 
Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900484 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900493 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900502 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900513 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900522 4776 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900531 4776 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900541 4776 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: 
I0128 06:50:48.900564 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900572 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900581 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900591 4776 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900600 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900608 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900617 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900628 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900638 4776 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900647 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.896105 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900658 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900670 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900680 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900689 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900698 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900709 4776 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900717 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900725 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900736 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900745 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900754 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900764 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900775 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900784 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900792 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.896178 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.896224 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900811 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900828 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900838 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900848 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900860 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900870 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900880 4776 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc 
kubenswrapper[4776]: I0128 06:50:48.900889 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900902 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900912 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900920 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900929 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900939 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900947 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900954 4776 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900963 4776 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900974 4776 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.900982 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.896799 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.896977 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.897485 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.897600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.897859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.885516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898177 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.898231 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.902453 4776 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.903997 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.904578 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.905254 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.905337 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:49.405317257 +0000 UTC m=+20.820977417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.905523 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.906039 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.906304 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.906439 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.906883 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.909095 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.912311 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.912342 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.912359 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.912417 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:49.412400731 +0000 UTC m=+20.828060891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.912485 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.913051 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.914159 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.914180 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.914191 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:48 crc kubenswrapper[4776]: E0128 06:50:48.914222 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:49.414213406 +0000 UTC m=+20.829873566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.914494 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.914778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.917706 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.917770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.922460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.924379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.926222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.929793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.942267 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.942656 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.952335 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2wlgk"] Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.952865 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.956307 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.956410 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.963865 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.976273 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.986593 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:48 crc kubenswrapper[4776]: I0128 06:50:48.998111 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002476 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqxv\" (UniqueName: \"kubernetes.io/projected/42dfc5af-3617-4121-9b26-9593c827a536-kube-api-access-pmqxv\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:49 
crc kubenswrapper[4776]: I0128 06:50:49.002542 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42dfc5af-3617-4121-9b26-9593c827a536-hosts-file\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002634 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002651 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002657 4776 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002686 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002697 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc 
kubenswrapper[4776]: I0128 06:50:49.002706 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002716 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002725 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002734 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002743 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002753 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002762 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002771 4776 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002780 4776 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002790 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002799 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002812 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002823 4776 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002833 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002844 4776 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002855 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002864 4776 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002874 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002891 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002901 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002912 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002922 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on 
node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.002932 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.009025 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.022403 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.031982 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.039815 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.052432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.081482 4776 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.081925 4776 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082019 4776 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very 
short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.081926 4776 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082072 4776 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082036 4776 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082074 4776 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.082002 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": read tcp 38.102.83.195:60372->38.102.83.195:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-apiserver-crc.188ed26466d95dfe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 06:50:32.267267582 +0000 UTC m=+3.682927742,LastTimestamp:2026-01-28 06:50:32.267267582 +0000 UTC m=+3.682927742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082143 4776 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.081926 4776 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.081944 4776 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082101 4776 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc 
kubenswrapper[4776]: W0128 06:50:49.082114 4776 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082205 4776 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082146 4776 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.082457 4776 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.103525 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42dfc5af-3617-4121-9b26-9593c827a536-hosts-file\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.103600 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqxv\" (UniqueName: \"kubernetes.io/projected/42dfc5af-3617-4121-9b26-9593c827a536-kube-api-access-pmqxv\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " 
pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.103857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42dfc5af-3617-4121-9b26-9593c827a536-hosts-file\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.120300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqxv\" (UniqueName: \"kubernetes.io/projected/42dfc5af-3617-4121-9b26-9593c827a536-kube-api-access-pmqxv\") pod \"node-resolver-2wlgk\" (UID: \"42dfc5af-3617-4121-9b26-9593c827a536\") " pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.155826 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.161763 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.168782 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.170782 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-db0b98df5d790fa823a4099b8e3df207d2cd702671299e7e775a120980c378b9 WatchSource:0}: Error finding container db0b98df5d790fa823a4099b8e3df207d2cd702671299e7e775a120980c378b9: Status 404 returned error can't find the container with id db0b98df5d790fa823a4099b8e3df207d2cd702671299e7e775a120980c378b9 Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.176154 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.176513 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -f "/env/_master" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: source "/env/_master" Jan 28 06:50:49 crc kubenswrapper[4776]: set +o allexport Jan 28 06:50:49 crc 
kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Jan 28 06:50:49 crc kubenswrapper[4776]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 28 06:50:49 crc kubenswrapper[4776]: ho_enable="--enable-hybrid-overlay" Jan 28 06:50:49 crc kubenswrapper[4776]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 28 06:50:49 crc kubenswrapper[4776]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 28 06:50:49 crc kubenswrapper[4776]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-host=127.0.0.1 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-port=9743 \ Jan 28 06:50:49 crc kubenswrapper[4776]: ${ho_enable} \ Jan 28 06:50:49 crc kubenswrapper[4776]: --enable-interconnect \ Jan 28 06:50:49 crc kubenswrapper[4776]: --disable-approver \ Jan 28 06:50:49 crc kubenswrapper[4776]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --wait-for-kubernetes-api=200s \ Jan 28 06:50:49 crc kubenswrapper[4776]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --loglevel="${LOGLEVEL}" Jan 28 06:50:49 crc kubenswrapper[4776]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc 
kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.180691 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.184992 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -f "/env/_master" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: source "/env/_master" Jan 28 06:50:49 crc kubenswrapper[4776]: set +o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --disable-webhook \ Jan 28 06:50:49 crc kubenswrapper[4776]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --loglevel="${LOGLEVEL}" Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.186181 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.186812 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c9b1301ccfe8d22eee9cafa8561521520d66b94fb686a60f0727c263bf478873 WatchSource:0}: Error finding container c9b1301ccfe8d22eee9cafa8561521520d66b94fb686a60f0727c263bf478873: Status 404 returned error can't find the container with id c9b1301ccfe8d22eee9cafa8561521520d66b94fb686a60f0727c263bf478873 Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.188908 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: source /etc/kubernetes/apiserver-url.env Jan 28 06:50:49 crc kubenswrapper[4776]: else Jan 28 06:50:49 crc kubenswrapper[4776]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 28 06:50:49 crc kubenswrapper[4776]: exit 1 Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 28 06:50:49 crc kubenswrapper[4776]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.190077 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.257915 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 
00:15:23.42134613 +0000 UTC Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.272577 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2wlgk" Jan 28 06:50:49 crc kubenswrapper[4776]: W0128 06:50:49.282123 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42dfc5af_3617_4121_9b26_9593c827a536.slice/crio-13f0ce6595276c7ea1a3f3d1fb9027253dcc5f31966ee6b2ce072d834d8038c4 WatchSource:0}: Error finding container 13f0ce6595276c7ea1a3f3d1fb9027253dcc5f31966ee6b2ce072d834d8038c4: Status 404 returned error can't find the container with id 13f0ce6595276c7ea1a3f3d1fb9027253dcc5f31966ee6b2ce072d834d8038c4 Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.283793 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 28 06:50:49 crc kubenswrapper[4776]: set -uo pipefail Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 28 06:50:49 crc kubenswrapper[4776]: HOSTS_FILE="/etc/hosts" Jan 28 06:50:49 crc kubenswrapper[4776]: TEMP_FILE="/etc/hosts.tmp" Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Make a temporary file with the old hosts file's attributes. Jan 28 06:50:49 crc kubenswrapper[4776]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 28 06:50:49 crc kubenswrapper[4776]: echo "Failed to preserve hosts file. Exiting." Jan 28 06:50:49 crc kubenswrapper[4776]: exit 1 Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: while true; do Jan 28 06:50:49 crc kubenswrapper[4776]: declare -A svc_ips Jan 28 06:50:49 crc kubenswrapper[4776]: for svc in "${services[@]}"; do Jan 28 06:50:49 crc kubenswrapper[4776]: # Fetch service IP from cluster dns if present. We make several tries Jan 28 06:50:49 crc kubenswrapper[4776]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 28 06:50:49 crc kubenswrapper[4776]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 28 06:50:49 crc kubenswrapper[4776]: # support UDP loadbalancers and require reaching DNS through TCP. Jan 28 06:50:49 crc kubenswrapper[4776]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 28 06:50:49 crc kubenswrapper[4776]: for i in ${!cmds[*]} Jan 28 06:50:49 crc kubenswrapper[4776]: do Jan 28 06:50:49 crc kubenswrapper[4776]: ips=($(eval "${cmds[i]}")) Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: svc_ips["${svc}"]="${ips[@]}" Jan 28 06:50:49 crc kubenswrapper[4776]: break Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Update /etc/hosts only if we get valid service IPs Jan 28 06:50:49 crc kubenswrapper[4776]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 28 06:50:49 crc kubenswrapper[4776]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 28 06:50:49 crc kubenswrapper[4776]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 28 06:50:49 crc kubenswrapper[4776]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: continue Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Append resolver entries for services Jan 28 06:50:49 crc kubenswrapper[4776]: rc=0 Jan 28 06:50:49 crc kubenswrapper[4776]: for svc in "${!svc_ips[@]}"; do Jan 28 06:50:49 crc kubenswrapper[4776]: for ip in ${svc_ips[${svc}]}; do Jan 28 06:50:49 crc kubenswrapper[4776]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ $rc -ne 0 ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: continue Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 28 06:50:49 crc kubenswrapper[4776]: # Replace /etc/hosts with our modified version if needed Jan 28 06:50:49 crc kubenswrapper[4776]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 28 06:50:49 crc kubenswrapper[4776]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: unset svc_ips Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pmqxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2wlgk_openshift-dns(42dfc5af-3617-4121-9b26-9593c827a536): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.284963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2wlgk" podUID="42dfc5af-3617-4121-9b26-9593c827a536" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.304265 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.304405 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.308177 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.308776 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.310086 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.310774 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.311888 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.312457 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.313262 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.314384 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.314995 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.315177 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.316336 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.316923 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.319949 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.320634 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.321257 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.321844 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.323298 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.323899 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.324347 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.325347 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.325989 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.326592 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.326656 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.327607 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.328093 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.329343 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.329887 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.331011 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.331701 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.332571 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.333117 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.333977 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.334471 4776 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.334680 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.334663 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.336371 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.337403 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.337986 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.339862 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.341016 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.341729 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.343059 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.343993 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.343994 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.345146 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.345981 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.347179 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.347936 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.349131 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.353312 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.354186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.354478 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.355979 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 28 06:50:49 crc 
kubenswrapper[4776]: I0128 06:50:49.356486 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.357526 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.358163 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.358996 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.360636 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.361255 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.365879 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.375662 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.391592 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.406182 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.406406 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:50:50.406385752 +0000 UTC m=+21.822045922 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.406620 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.406662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.406721 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.406762 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:50.406753061 +0000 UTC m=+21.822413221 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.406773 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.406823 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:50.406800502 +0000 UTC m=+21.822460662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.431220 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c9b1301ccfe8d22eee9cafa8561521520d66b94fb686a60f0727c263bf478873"} Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.432031 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"db0b98df5d790fa823a4099b8e3df207d2cd702671299e7e775a120980c378b9"} Jan 28 06:50:49 crc 
kubenswrapper[4776]: E0128 06:50:49.432885 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: source /etc/kubernetes/apiserver-url.env Jan 28 06:50:49 crc kubenswrapper[4776]: else Jan 28 06:50:49 crc kubenswrapper[4776]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 28 06:50:49 crc kubenswrapper[4776]: exit 1 Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163
a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.433485 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.433706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wlgk" event={"ID":"42dfc5af-3617-4121-9b26-9593c827a536","Type":"ContainerStarted","Data":"13f0ce6595276c7ea1a3f3d1fb9027253dcc5f31966ee6b2ce072d834d8038c4"} Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.434306 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct 
envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.434802 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.435003 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 28 06:50:49 crc kubenswrapper[4776]: set -uo pipefail Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 28 06:50:49 crc kubenswrapper[4776]: HOSTS_FILE="/etc/hosts" Jan 28 06:50:49 crc kubenswrapper[4776]: TEMP_FILE="/etc/hosts.tmp" Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Make a temporary file with the old hosts file's attributes. Jan 28 06:50:49 crc kubenswrapper[4776]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 28 06:50:49 crc kubenswrapper[4776]: echo "Failed to preserve hosts file. Exiting." 
Jan 28 06:50:49 crc kubenswrapper[4776]: exit 1 Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: while true; do Jan 28 06:50:49 crc kubenswrapper[4776]: declare -A svc_ips Jan 28 06:50:49 crc kubenswrapper[4776]: for svc in "${services[@]}"; do Jan 28 06:50:49 crc kubenswrapper[4776]: # Fetch service IP from cluster dns if present. We make several tries Jan 28 06:50:49 crc kubenswrapper[4776]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 28 06:50:49 crc kubenswrapper[4776]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 28 06:50:49 crc kubenswrapper[4776]: # support UDP loadbalancers and require reaching DNS through TCP. Jan 28 06:50:49 crc kubenswrapper[4776]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 28 06:50:49 crc kubenswrapper[4776]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 28 06:50:49 crc kubenswrapper[4776]: for i in ${!cmds[*]} Jan 28 06:50:49 crc kubenswrapper[4776]: do Jan 28 06:50:49 crc kubenswrapper[4776]: ips=($(eval "${cmds[i]}")) Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: svc_ips["${svc}"]="${ips[@]}" Jan 28 06:50:49 crc kubenswrapper[4776]: break Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Update /etc/hosts only if we get valid service IPs Jan 28 06:50:49 crc kubenswrapper[4776]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 28 06:50:49 crc kubenswrapper[4776]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 28 06:50:49 crc kubenswrapper[4776]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 28 06:50:49 crc kubenswrapper[4776]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: continue Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # Append resolver entries for services Jan 28 06:50:49 crc kubenswrapper[4776]: rc=0 Jan 28 06:50:49 crc kubenswrapper[4776]: for svc in "${!svc_ips[@]}"; do Jan 28 06:50:49 crc kubenswrapper[4776]: for ip in ${svc_ips[${svc}]}; do Jan 28 06:50:49 crc kubenswrapper[4776]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ $rc -ne 0 ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: continue Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 28 06:50:49 crc kubenswrapper[4776]: # Replace /etc/hosts with our modified version if needed Jan 28 06:50:49 crc kubenswrapper[4776]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 28 06:50:49 crc kubenswrapper[4776]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: sleep 60 & wait Jan 28 06:50:49 crc kubenswrapper[4776]: unset svc_ips Jan 28 06:50:49 crc kubenswrapper[4776]: done Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pmqxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2wlgk_openshift-dns(42dfc5af-3617-4121-9b26-9593c827a536): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.435234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"04819b8773d9748754c7d81234e2116435dbfbb942a738284f5376d139a3b74b"} Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.435799 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.435949 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.436422 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2wlgk" podUID="42dfc5af-3617-4121-9b26-9593c827a536" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.436625 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 28 06:50:49 crc kubenswrapper[4776]: if [[ -f "/env/_master" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: source "/env/_master" Jan 28 06:50:49 crc kubenswrapper[4776]: set +o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Jan 28 06:50:49 crc kubenswrapper[4776]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 28 06:50:49 crc kubenswrapper[4776]: ho_enable="--enable-hybrid-overlay" Jan 28 06:50:49 crc kubenswrapper[4776]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 28 06:50:49 crc kubenswrapper[4776]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 28 06:50:49 crc kubenswrapper[4776]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-host=127.0.0.1 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --webhook-port=9743 \ Jan 28 06:50:49 crc kubenswrapper[4776]: ${ho_enable} \ Jan 28 06:50:49 crc kubenswrapper[4776]: --enable-interconnect \ Jan 28 06:50:49 crc kubenswrapper[4776]: --disable-approver \ Jan 28 06:50:49 crc kubenswrapper[4776]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --wait-for-kubernetes-api=200s \ Jan 28 06:50:49 crc kubenswrapper[4776]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --loglevel="${LOGLEVEL}" Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.438392 4776 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 28 06:50:49 crc kubenswrapper[4776]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 28 06:50:49 crc 
kubenswrapper[4776]: if [[ -f "/env/_master" ]]; then Jan 28 06:50:49 crc kubenswrapper[4776]: set -o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: source "/env/_master" Jan 28 06:50:49 crc kubenswrapper[4776]: set +o allexport Jan 28 06:50:49 crc kubenswrapper[4776]: fi Jan 28 06:50:49 crc kubenswrapper[4776]: Jan 28 06:50:49 crc kubenswrapper[4776]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 28 06:50:49 crc kubenswrapper[4776]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 28 06:50:49 crc kubenswrapper[4776]: --disable-webhook \ Jan 28 06:50:49 crc kubenswrapper[4776]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 28 06:50:49 crc kubenswrapper[4776]: --loglevel="${LOGLEVEL}" Jan 28 06:50:49 crc kubenswrapper[4776]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 28 06:50:49 crc kubenswrapper[4776]: > logger="UnhandledError" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.439608 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.444403 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.469323 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.481425 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.490772 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.500068 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.507673 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.507786 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507847 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507869 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507882 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507921 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:50.507907697 +0000 UTC m=+21.923567847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507973 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507982 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.507988 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:49 crc kubenswrapper[4776]: E0128 06:50:49.508007 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:50.508001539 +0000 UTC m=+21.923661699 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.509024 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.516169 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.522125 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.530697 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.536900 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.545264 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.554402 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.567548 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.591162 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.605128 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.622309 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.682493 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-stl56"] Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.682881 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hmlx4"] Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.683049 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.683703 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mng44"] Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.683987 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.684048 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.685034 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.685464 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.685980 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686067 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686280 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686342 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686604 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686783 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.686906 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.687029 4776 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.688103 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.699398 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.712941 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.723092 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.733756 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.742249 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.750968 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.764431 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.772887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.779349 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.786492 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.793523 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.804121 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-28 06:45:48 +0000 UTC, rotation deadline is 2026-12-18 20:45:00.771563546 +0000 UTC Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.804192 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7789h54m10.967373337s for next certificate rotation Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.806254 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810630 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-netns\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-system-cni-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-cni-binary-copy\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-socket-dir-parent\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.810979 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-bin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-os-release\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811095 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-kubelet\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811129 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-conf-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811155 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-daemon-config\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-k8s-cni-cncf-io\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811261 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-722rr\" (UniqueName: \"kubernetes.io/projected/307c371c-8938-4d8c-826c-a682302f1003-kube-api-access-722rr\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811300 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-hostroot\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811321 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-multus-certs\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811344 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-cnibin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811363 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-os-release\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-system-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-multus\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3539113f-fe53-40a0-a08c-d7f86951d067-proxy-tls\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811504 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3539113f-fe53-40a0-a08c-d7f86951d067-mcd-auth-proxy-config\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811523 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-cnibin\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811598 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811642 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-etc-kubernetes\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7x6m\" (UniqueName: \"kubernetes.io/projected/fe4cd320-31b6-43af-a080-c8b4855a1a79-kube-api-access-s7x6m\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811717 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3539113f-fe53-40a0-a08c-d7f86951d067-rootfs\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.811748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmvjw\" (UniqueName: \"kubernetes.io/projected/3539113f-fe53-40a0-a08c-d7f86951d067-kube-api-access-dmvjw\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.820215 4776 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.830608 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.840823 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.849350 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.856639 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.868809 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.878933 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.887902 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-netns\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 
06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912465 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-cni-binary-copy\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-socket-dir-parent\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912515 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-bin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912538 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-system-cni-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 
06:50:49.912592 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-os-release\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912680 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-netns\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-socket-dir-parent\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912763 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-system-cni-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912928 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-bin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912958 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-os-release\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913422 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-cni-binary-copy\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913444 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.912614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-kubelet\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913551 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-kubelet\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-conf-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-conf-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913595 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-daemon-config\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-k8s-cni-cncf-io\") pod \"multus-mng44\" (UID: 
\"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-722rr\" (UniqueName: \"kubernetes.io/projected/307c371c-8938-4d8c-826c-a682302f1003-kube-api-access-722rr\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913696 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-multus-certs\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913724 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-hostroot\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-cnibin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " 
pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913754 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-os-release\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3539113f-fe53-40a0-a08c-d7f86951d067-proxy-tls\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913786 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-system-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-multus\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913832 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3539113f-fe53-40a0-a08c-d7f86951d067-mcd-auth-proxy-config\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913846 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7x6m\" (UniqueName: \"kubernetes.io/projected/fe4cd320-31b6-43af-a080-c8b4855a1a79-kube-api-access-s7x6m\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-cnibin\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-etc-kubernetes\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.913929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3539113f-fe53-40a0-a08c-d7f86951d067-rootfs\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc 
kubenswrapper[4776]: I0128 06:50:49.913955 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmvjw\" (UniqueName: \"kubernetes.io/projected/3539113f-fe53-40a0-a08c-d7f86951d067-kube-api-access-dmvjw\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914128 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-var-lib-cni-multus\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914165 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-multus-certs\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914190 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/307c371c-8938-4d8c-826c-a682302f1003-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-hostroot\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914235 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-cnibin\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914257 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-host-run-k8s-cni-cncf-io\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-os-release\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914372 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3539113f-fe53-40a0-a08c-d7f86951d067-rootfs\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914397 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-system-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914428 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-cni-dir\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914576 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/307c371c-8938-4d8c-826c-a682302f1003-cnibin\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914594 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe4cd320-31b6-43af-a080-c8b4855a1a79-etc-kubernetes\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.914818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3539113f-fe53-40a0-a08c-d7f86951d067-mcd-auth-proxy-config\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.915228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe4cd320-31b6-43af-a080-c8b4855a1a79-multus-daemon-config\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.917881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3539113f-fe53-40a0-a08c-d7f86951d067-proxy-tls\") pod \"machine-config-daemon-stl56\" (UID: 
\"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.929285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7x6m\" (UniqueName: \"kubernetes.io/projected/fe4cd320-31b6-43af-a080-c8b4855a1a79-kube-api-access-s7x6m\") pod \"multus-mng44\" (UID: \"fe4cd320-31b6-43af-a080-c8b4855a1a79\") " pod="openshift-multus/multus-mng44" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.930182 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmvjw\" (UniqueName: \"kubernetes.io/projected/3539113f-fe53-40a0-a08c-d7f86951d067-kube-api-access-dmvjw\") pod \"machine-config-daemon-stl56\" (UID: \"3539113f-fe53-40a0-a08c-d7f86951d067\") " pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.931145 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-722rr\" (UniqueName: \"kubernetes.io/projected/307c371c-8938-4d8c-826c-a682302f1003-kube-api-access-722rr\") pod \"multus-additional-cni-plugins-hmlx4\" (UID: \"307c371c-8938-4d8c-826c-a682302f1003\") " pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.934364 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.951363 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.966098 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:49 crc kubenswrapper[4776]: I0128 06:50:49.997811 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.007895 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" Jan 28 06:50:50 crc kubenswrapper[4776]: W0128 06:50:50.014029 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3539113f_fe53_40a0_a08c_d7f86951d067.slice/crio-fb28a074b3d9270a646a494da1cb198bd724a01791df51d7fc4f3bd4f095134d WatchSource:0}: Error finding container fb28a074b3d9270a646a494da1cb198bd724a01791df51d7fc4f3bd4f095134d: Status 404 returned error can't find the container with id fb28a074b3d9270a646a494da1cb198bd724a01791df51d7fc4f3bd4f095134d Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.014180 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.019169 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mng44" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.040717 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hf24q"] Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.041460 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.044301 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.044475 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.044519 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.045540 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.045688 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.045892 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.046007 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.056378 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.065207 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.076177 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.086766 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.099333 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.109530 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115538 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115618 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115645 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115697 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115717 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115734 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 
crc kubenswrapper[4776]: I0128 06:50:50.115760 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115778 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115818 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115867 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115883 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115918 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4q8\" (UniqueName: \"kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115933 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd\") pod \"ovnkube-node-hf24q\" (UID: 
\"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115947 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.115977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.118580 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.133491 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.144136 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.154202 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.166909 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.203018 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217359 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217406 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin\") pod \"ovnkube-node-hf24q\" 
(UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217432 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217530 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217572 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib\") pod \"ovnkube-node-hf24q\" (UID: 
\"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217608 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217503 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc 
kubenswrapper[4776]: I0128 06:50:50.217665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217791 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217916 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217958 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.217973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218005 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218066 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4q8\" (UniqueName: \"kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218139 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218160 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218182 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218230 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218306 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218384 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn\") pod \"ovnkube-node-hf24q\" (UID: 
\"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218543 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218619 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.218896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc 
kubenswrapper[4776]: I0128 06:50:50.221974 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.227027 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.243029 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.258147 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 00:11:44.563809518 +0000 UTC Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.277603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-df4q8\" (UniqueName: \"kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8\") pod \"ovnkube-node-hf24q\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.282872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.304420 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.304508 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.304585 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.304696 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.372252 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.419777 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.419882 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.419961 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:50:52.419932811 +0000 UTC m=+23.835592971 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.419987 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.420016 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.420045 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:52.420029654 +0000 UTC m=+23.835689874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.420156 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.420211 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:52.420195488 +0000 UTC m=+23.835855738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.426243 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.437206 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.438911 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mng44" event={"ID":"fe4cd320-31b6-43af-a080-c8b4855a1a79","Type":"ContainerStarted","Data":"400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d"} Jan 28 06:50:50 crc 
kubenswrapper[4776]: I0128 06:50:50.438951 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mng44" event={"ID":"fe4cd320-31b6-43af-a080-c8b4855a1a79","Type":"ContainerStarted","Data":"a0eb401fabd0465e17f28dc5306720940e17e9d37cd1e2e4381bd143d45aaa07"} Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.439812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerStarted","Data":"55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b"} Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.439858 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerStarted","Data":"d2c6053d0e94aee289aa8633807c793874f31961eded4294451e1ccff37b1891"} Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.441044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270"} Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.441071 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c"} Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.441083 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"fb28a074b3d9270a646a494da1cb198bd724a01791df51d7fc4f3bd4f095134d"} Jan 28 06:50:50 crc kubenswrapper[4776]: 
I0128 06:50:50.481096 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.524706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.524917 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.525409 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.525442 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.525456 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.525524 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:52.525485065 +0000 UTC m=+23.941145295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.526084 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.526108 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.526119 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:50 crc kubenswrapper[4776]: E0128 06:50:50.526148 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:52.526139281 +0000 UTC m=+23.941799481 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.526265 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.526663 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.535576 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.546473 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.554377 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.571923 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.599498 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.603296 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.609757 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.634380 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.669113 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.683388 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.724317 4776 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.748078 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.787205 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.829167 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.868310 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.908953 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.949088 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:50 crc kubenswrapper[4776]: I0128 06:50:50.988940 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.028458 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.069667 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.107932 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.147778 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.187683 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.227319 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.259145 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 07:26:35.714204664 +0000 UTC Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.267858 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 
06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.304270 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:51 crc kubenswrapper[4776]: E0128 06:50:51.304394 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.309411 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.355084 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.387480 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.433149 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.444917 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" exitCode=0 Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.445011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.445076 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"afe2602e536a86aeee2007d0b0a5a8180fc1ebf1536c2ec351181e0b5588e77f"} Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.446490 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b" exitCode=0 Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.446580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b"} Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.476430 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.508447 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.548278 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.591229 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.595893 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sqkt7"] Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.596297 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.622747 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.642696 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.662430 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.683337 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.709886 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.738075 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-serviceca\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.738117 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-host\") pod \"node-ca-sqkt7\" (UID: 
\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.738140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq559\" (UniqueName: \"kubernetes.io/projected/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-kube-api-access-nq559\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.748638 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.788751 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.833379 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.839294 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-host\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.839358 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq559\" (UniqueName: \"kubernetes.io/projected/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-kube-api-access-nq559\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.839431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-serviceca\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.839490 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-host\") pod \"node-ca-sqkt7\" (UID: 
\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.840429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-serviceca\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.886596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq559\" (UniqueName: \"kubernetes.io/projected/d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7-kube-api-access-nq559\") pod \"node-ca-sqkt7\" (UID: \"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\") " pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.889280 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.928769 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.953371 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqkt7" Jan 28 06:50:51 crc kubenswrapper[4776]: W0128 06:50:51.966027 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd340a5a9_2dbb_4a21_b9a4_c9b99ffb4ca7.slice/crio-0c0de9490b477737b94222701459835eb569b426c78c1450930d0b462d58dcda WatchSource:0}: Error finding container 0c0de9490b477737b94222701459835eb569b426c78c1450930d0b462d58dcda: Status 404 returned error can't find the container with id 0c0de9490b477737b94222701459835eb569b426c78c1450930d0b462d58dcda Jan 28 06:50:51 crc kubenswrapper[4776]: I0128 06:50:51.968330 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.015785 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.050029 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b
15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.086888 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.131974 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.167442 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.210988 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 
06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.249571 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.259430 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 21:38:49.350084686 +0000 UTC Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.289125 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.304565 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.304589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.304698 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.304766 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.333067 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.366764 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.409500 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.446412 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.446621 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.446657 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.446758 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.446794 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:50:56.446761312 +0000 UTC m=+27.862421482 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.446836 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:56.446823454 +0000 UTC m=+27.862483624 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.446868 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.446943 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:56.446926086 +0000 UTC m=+27.862586256 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.451662 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.452206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqkt7" event={"ID":"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7","Type":"ContainerStarted","Data":"7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.452266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqkt7" event={"ID":"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7","Type":"ContainerStarted","Data":"0c0de9490b477737b94222701459835eb569b426c78c1450930d0b462d58dcda"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.457791 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b" exitCode=0 Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.457918 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" 
event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464441 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.464461 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.488378 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.528008 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.547449 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.547598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" 
Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547698 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547715 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547727 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547735 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547768 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:56.547753134 +0000 UTC m=+27.963413294 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547772 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547792 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:52 crc kubenswrapper[4776]: E0128 06:50:52.547864 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:50:56.547843386 +0000 UTC m=+27.963503636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.570044 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.609383 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.651152 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 
06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.693087 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.732635 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.769521 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.835320 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.853684 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.890506 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.928306 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:52 crc kubenswrapper[4776]: I0128 06:50:52.967738 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.044932 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc 
kubenswrapper[4776]: I0128 06:50:53.055362 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.088181 4776 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.127505 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.170761 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.260563 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 15:24:02.953361312 +0000 UTC Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.304432 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:53 crc kubenswrapper[4776]: E0128 06:50:53.304649 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.470241 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1" exitCode=0 Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.470286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1"} Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.488524 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.498977 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.508361 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.527421 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.535746 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.548258 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.557496 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.566052 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.573705 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.579692 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.611848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9b
e5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.648811 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-co
py\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.692431 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:53 crc kubenswrapper[4776]: I0128 06:50:53.728319 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.261532 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-12-19 19:34:11.904835363 +0000 UTC Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.304478 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.304578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:54 crc kubenswrapper[4776]: E0128 06:50:54.304697 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:54 crc kubenswrapper[4776]: E0128 06:50:54.304793 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.482818 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b" exitCode=0 Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.482944 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b"} Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.489710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.500425 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.512579 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.522271 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.537747 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.552775 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.569626 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.582734 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.610656 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.620214 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.643349 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.657703 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.669052 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.682693 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6b
caaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:54 crc kubenswrapper[4776]: I0128 06:50:54.695914 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.160086 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.163634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.163673 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.163686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.163786 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.169661 4776 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.169918 4776 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.171098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.171138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.171151 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.171167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.171177 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.180293 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32f27aa7-2aa2-417b-80de-993c2f103850\\\",\\\"systemUUID\\\":\\\"53a286a7-147d-439f-bf29-b3b09993325f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.183095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.183123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.183133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.183150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.183162 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.187735 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.188677 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.188877 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.190818 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32f27aa7-2aa2-417b-80de-993c2f103850\\\",\\\"systemUUID\\\":\\\"53a286a7-147d-439f-bf29-b3b09993325f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.193921 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.193949 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.193957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.193969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.193980 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.201879 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32f27aa7-2aa2-417b-80de-993c2f103850\\\",\\\"systemUUID\\\":\\\"53a286a7-147d-439f-bf29-b3b09993325f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.205129 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.205169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.205178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.205192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.205203 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.214466 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32f27aa7-2aa2-417b-80de-993c2f103850\\\",\\\"systemUUID\\\":\\\"53a286a7-147d-439f-bf29-b3b09993325f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.218694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.218764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.218782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.218806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.218820 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.227191 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32f27aa7-2aa2-417b-80de-993c2f103850\\\",\\\"systemUUID\\\":\\\"53a286a7-147d-439f-bf29-b3b09993325f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.227336 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.228943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.228972 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.228981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.229000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.229010 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.264655 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 08:48:58.662607667 +0000 UTC Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.304204 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:55 crc kubenswrapper[4776]: E0128 06:50:55.304374 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.331101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.331146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.331159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.331179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.331189 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.433752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.433808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.433828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.433852 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.433868 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.496918 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91" exitCode=0 Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.496967 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.509187 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.530386 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.536384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.536415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.536422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.536437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.536447 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.538620 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.552596 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.562376 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.572307 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.580517 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.588454 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.599530 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6b
caaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.608882 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.616892 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.626432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.636504 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.639002 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.639029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.639040 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.639056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.639066 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.644953 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.741375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.741708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.741719 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 
06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.741735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.741745 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.843732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.843771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.843780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.843795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.843805 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.946133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.946160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.946168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.946183 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:55 crc kubenswrapper[4776]: I0128 06:50:55.946192 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:55Z","lastTransitionTime":"2026-01-28T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.048268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.048338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.048356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.048383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.048404 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.150880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.150919 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.150929 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.150944 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.150953 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.253388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.253435 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.253449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.253468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.253483 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.264977 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 15:26:01.073536063 +0000 UTC Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.304589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.304644 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.304768 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.304887 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.355314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.355351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.355362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.355382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.355394 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.457925 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.457972 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.457983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.457999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.458009 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.491250 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.491428 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.491459 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.491435324 +0000 UTC m=+35.907095494 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.491520 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.491529 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.491619 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.491609588 +0000 UTC m=+35.907269758 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.491630 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.491667 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.491655559 +0000 UTC m=+35.907315719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.502720 4776 generic.go:334] "Generic (PLEG): container finished" podID="307c371c-8938-4d8c-826c-a682302f1003" containerID="33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604" exitCode=0 Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.502760 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerDied","Data":"33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.513715 4776 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.562296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.562339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.562349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.562365 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.562374 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.564158 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.575179 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.585760 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.592244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:56 crc 
kubenswrapper[4776]: I0128 06:50:56.592368 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592516 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592540 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592632 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592680 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.592663821 +0000 UTC m=+36.008323981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592747 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592758 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592768 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:56 crc kubenswrapper[4776]: E0128 06:50:56.592794 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.592784515 +0000 UTC m=+36.008444675 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.596171 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.604811 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.617865 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.624300 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.633826 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.647976 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.658796 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.672229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.672281 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.672292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.672311 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.672325 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.673001 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.682650 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.691317 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.774711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.774740 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.774748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.774761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.774770 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.780527 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.876882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.876915 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.876927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.876943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.876955 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.980016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.980058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.980066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.980083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:56 crc kubenswrapper[4776]: I0128 06:50:56.980093 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:56Z","lastTransitionTime":"2026-01-28T06:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.082791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.082866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.082875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.082888 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.082900 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.185653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.185701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.185713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.185730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.185741 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.265167 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 01:53:21.430469425 +0000 UTC Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.289370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.289736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.289753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.289775 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.289791 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.303792 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:57 crc kubenswrapper[4776]: E0128 06:50:57.303995 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.392674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.392775 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.392806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.392838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.392863 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.495828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.495877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.495890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.495906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.495918 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.510842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerStarted","Data":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.510944 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.510978 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.518317 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" event={"ID":"307c371c-8938-4d8c-826c-a682302f1003","Type":"ContainerStarted","Data":"2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.526801 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.538236 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.549232 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.549614 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.555450 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.567014 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.591923 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.598753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.598803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.598828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.598866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.598892 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.601970 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.619275 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.632309 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.641166 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.654341 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6b
caaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.667098 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.677663 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.694099 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.701438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.701500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.701518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 
06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.701543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.701597 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.711984 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.726839 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.739087 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.755263 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.767505 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.783949 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.791593 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.804479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.804707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.804811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.804914 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.804996 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.809632 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.818887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.828311 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.841117 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b68
8059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.852460 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.862465 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.874195 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.885836 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.907339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.907394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.907413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.907440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:57 crc kubenswrapper[4776]: I0128 06:50:57.907459 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:57Z","lastTransitionTime":"2026-01-28T06:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.009404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.009456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.009468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.009488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.009502 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.112018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.112359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.112581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.113045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.113277 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.216642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.216686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.216697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.216716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.216730 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.265939 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:07:42.03867502 +0000 UTC Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.303710 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.303783 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:50:58 crc kubenswrapper[4776]: E0128 06:50:58.303973 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:50:58 crc kubenswrapper[4776]: E0128 06:50:58.304113 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.318586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.318640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.318657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.318680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.318699 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.421967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.422030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.422047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.422070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.422087 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.521234 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.524167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.524217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.524232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.524253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.524268 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.626783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.626850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.626878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.626910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.626932 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.729194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.729234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.729245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.729265 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.729278 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.831951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.832001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.832018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.832043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.832058 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.854648 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.934367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.934406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.934416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.934433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:58 crc kubenswrapper[4776]: I0128 06:50:58.934444 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:58Z","lastTransitionTime":"2026-01-28T06:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.036409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.036564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.036576 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.036593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.036604 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.139443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.139506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.139524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.139587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.139614 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.242215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.242251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.242263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.242278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.242289 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.266249 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 08:24:53.885366192 +0000 UTC Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.304165 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:50:59 crc kubenswrapper[4776]: E0128 06:50:59.304339 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.322887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.332719 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.341481 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.344230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.344268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc 
kubenswrapper[4776]: I0128 06:50:59.344280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.344297 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.344310 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.349655 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9
aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.364076 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.370932 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.379176 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.389060 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.394916 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.404814 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b68
8059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.413657 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.421746 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.432761 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.442797 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.446516 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.446582 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.446599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.446620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.446635 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.524307 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.549050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.549093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.549111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.549130 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.549143 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.650673 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.650712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.650720 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.650735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.650745 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.753277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.753319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.753331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.753349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.753361 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.855700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.855733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.855741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.855755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.855763 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.958169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.958212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.958222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.958238 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:50:59 crc kubenswrapper[4776]: I0128 06:50:59.958248 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:50:59Z","lastTransitionTime":"2026-01-28T06:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.061006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.061072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.061083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.061100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.061111 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.164024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.164059 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.164067 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.164082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.164091 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266266 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266298 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.266434 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 21:01:41.339915431 +0000 UTC Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.304152 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.304194 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:00 crc kubenswrapper[4776]: E0128 06:51:00.304276 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:51:00 crc kubenswrapper[4776]: E0128 06:51:00.304852 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.368705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.368749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.368757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.368771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.368781 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.471170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.471205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.471214 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.471230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.471238 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.573170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.573210 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.573223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.573240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.573252 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.676138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.676191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.676202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.676220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.676234 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.779122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.779164 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.779178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.779220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.779234 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.882400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.882448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.882461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.882478 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.882491 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.987179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.987236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.987255 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.987283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:00 crc kubenswrapper[4776]: I0128 06:51:00.987304 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:00Z","lastTransitionTime":"2026-01-28T06:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.090216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.090259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.090269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.090288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.090298 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.193695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.193775 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.193799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.193832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.193856 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.266935 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:01:38.420144079 +0000 UTC Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.297370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.297404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.297417 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.297431 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.297443 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.306009 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:01 crc kubenswrapper[4776]: E0128 06:51:01.306358 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.399386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.399428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.399440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.399455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.399466 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.502067 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.502107 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.502116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.502133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.502147 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.532470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dd2dceb3a199fc513680e23d5c6920959e4dfb0250cf9389f41c4baa2836c259"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.550671 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.560864 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.575502 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.586369 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.597605 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd2dceb3a199fc513680e23d5c6920959e4dfb0250cf9389f41c4baa2836c259\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.608845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.608886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.608898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.608918 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.608930 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.613854 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.621463 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.633371 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b68
8059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.642594 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.654619 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.665719 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.676882 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.687579 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.697529 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.711722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.711765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.711777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.711794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.711808 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.814855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.814886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.814897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.814913 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.814924 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.917516 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.917579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.917592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.917608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.917620 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:01Z","lastTransitionTime":"2026-01-28T06:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.979154 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t"] Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.980093 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.982777 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 06:51:01 crc kubenswrapper[4776]: I0128 06:51:01.983304 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.003212 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.019838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.019890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.019904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.019927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.019942 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:02Z","lastTransitionTime":"2026-01-28T06:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.022529 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd2dceb3a199fc513680e23d5c6920959e4dfb0250cf9389f41c4baa2836c259\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.041865 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.048410 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.048503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91620994-a03e-49f7-aa74-64837a116ac1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.048591 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf44j\" (UniqueName: \"kubernetes.io/projected/91620994-a03e-49f7-aa74-64837a116ac1-kube-api-access-sf44j\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.048620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.079031 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.089354 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.099688 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b99
5f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.110981 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.117881 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.121913 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.121944 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.121953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.121967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.121980 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:02Z","lastTransitionTime":"2026-01-28T06:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.127403 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.135595 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.141843 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91620994-a03e-49f7-aa74-64837a116ac1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6pj9t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.149166 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82
799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.149419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf44j\" (UniqueName: \"kubernetes.io/projected/91620994-a03e-49f7-aa74-64837a116ac1-kube-api-access-sf44j\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.149454 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: 
\"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.149492 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.149518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91620994-a03e-49f7-aa74-64837a116ac1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.150144 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.150316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91620994-a03e-49f7-aa74-64837a116ac1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.154486 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/91620994-a03e-49f7-aa74-64837a116ac1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.159295 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.162626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf44j\" (UniqueName: \"kubernetes.io/projected/91620994-a03e-49f7-aa74-64837a116ac1-kube-api-access-sf44j\") pod \"ovnkube-control-plane-749d76644c-6pj9t\" (UID: \"91620994-a03e-49f7-aa74-64837a116ac1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.170214 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.179737 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.224757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.224802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.224812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.224830 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.224842 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:02Z","lastTransitionTime":"2026-01-28T06:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.267608 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 02:49:00.268871914 +0000 UTC Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.302861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.304800 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.304843 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:02 crc kubenswrapper[4776]: E0128 06:51:02.304964 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:51:02 crc kubenswrapper[4776]: E0128 06:51:02.305088 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:51:02 crc kubenswrapper[4776]: W0128 06:51:02.319141 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91620994_a03e_49f7_aa74_64837a116ac1.slice/crio-d1e82b46e1574adf6cf2f1d0d1da259f554784198295672ddc67bbe615815823 WatchSource:0}: Error finding container d1e82b46e1574adf6cf2f1d0d1da259f554784198295672ddc67bbe615815823: Status 404 returned error can't find the container with id d1e82b46e1574adf6cf2f1d0d1da259f554784198295672ddc67bbe615815823 Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.326984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.327026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.327037 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.327053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.327067 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:02Z","lastTransitionTime":"2026-01-28T06:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.429732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.429777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.429788 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.429804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:02 crc kubenswrapper[4776]: I0128 06:51:02.429815 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:02Z","lastTransitionTime":"2026-01-28T06:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.204000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.204050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.204061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.204078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.204091 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.207792 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" event={"ID":"91620994-a03e-49f7-aa74-64837a116ac1","Type":"ContainerStarted","Data":"d1e82b46e1574adf6cf2f1d0d1da259f554784198295672ddc67bbe615815823"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.267900 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 12:21:54.244689425 +0000 UTC Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.304767 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:03 crc kubenswrapper[4776]: E0128 06:51:03.304886 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.306681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.306719 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.306731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.306751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.306767 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.409283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.409337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.409351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.409371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.409386 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.480793 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-d5hf2"] Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.481240 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: E0128 06:51:03.481297 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d5hf2" podUID="3819a037-a2a1-433f-884e-84bead904558" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.496043 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.503643 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.511733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.511779 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.511788 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.511805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.511815 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.514050 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d5hf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3819a037-a2a1-433f-884e-84bead904558\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn86p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn86p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:51:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d5hf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.527752 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814b
e47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.538068 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.556630 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd2dceb3a199fc513680e23d5c6920959e4dfb0250cf9389f41c4baa2836c259\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.565813 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.572287 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.584811 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b68
8059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.596869 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.599366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn86p\" (UniqueName: \"kubernetes.io/projected/3819a037-a2a1-433f-884e-84bead904558-kube-api-access-kn86p\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.599438 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.605380 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91620994-a03e-49f7-aa74-64837a116ac1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6pj9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.613981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.614064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.614080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.614105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.614122 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.618862 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32
dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.626099 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.633076 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.640654 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.649016 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.700852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.700909 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn86p\" (UniqueName: \"kubernetes.io/projected/3819a037-a2a1-433f-884e-84bead904558-kube-api-access-kn86p\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: E0128 
06:51:03.701167 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:03 crc kubenswrapper[4776]: E0128 06:51:03.701317 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs podName:3819a037-a2a1-433f-884e-84bead904558 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:04.20128543 +0000 UTC m=+35.616945590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs") pod "network-metrics-daemon-d5hf2" (UID: "3819a037-a2a1-433f-884e-84bead904558") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.716279 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.716325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.716334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.716352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.716362 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.722311 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn86p\" (UniqueName: \"kubernetes.io/projected/3819a037-a2a1-433f-884e-84bead904558-kube-api-access-kn86p\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.819185 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.819220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.819232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.819249 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.819264 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.922275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.922336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.922352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.922375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:03 crc kubenswrapper[4776]: I0128 06:51:03.922392 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:03Z","lastTransitionTime":"2026-01-28T06:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.026209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.026285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.026333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.026362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.026386 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.129940 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.129978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.129988 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.130004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.130015 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.206521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.206769 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.206846 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs podName:3819a037-a2a1-433f-884e-84bead904558 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:05.206820624 +0000 UTC m=+36.622480804 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs") pod "network-metrics-daemon-d5hf2" (UID: "3819a037-a2a1-433f-884e-84bead904558") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.214702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" event={"ID":"91620994-a03e-49f7-aa74-64837a116ac1","Type":"ContainerStarted","Data":"60910e8b17fa67e0976aebaa68b8ceeb5f5b17d322777342042a04f5ef27db98"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.214767 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" event={"ID":"91620994-a03e-49f7-aa74-64837a116ac1","Type":"ContainerStarted","Data":"59d317fd06d2f8b98276f4ae3afa4a2720ef8dcfb2bb7b00edcd7be78eb9d6a9"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.232941 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.232973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.232982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.232998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.233009 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.234890 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.244475 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2wlgk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dfc5af-3617-4121-9b26-9593c827a536\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2wlgk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.260125 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"307c371c-8938-4d8c-826c-a682302f1003\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a49d1023775b8affa801eac60dfb2151b3b82ba077ddd5371f53ff419b63e5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55e406adcd9af3147d4bc95d47f32015ce8ab948089d9170075695c2dbf8968b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdcd09d955d19afe47eb3d16a753efd7b6b3a1b3229ba200ada38c7076a75b1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cd9502f80895943fb6913bc2de9be5d7cf6bcaaa59bf99f4f7c8ddc88b167c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b68
8059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b688059b04760e745862ed8a0cfe68731a9a6ce8d795660dfe7312a0473a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdc7dc729e6065ccc2f66fd770cfb9bd53578b262bd6f25cbe5a51fe3aec3d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33fae968a9029307bd20864f236e7b93c6bc9e810d6f86af3c17f69d740b3604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-722rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmlx4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.268729 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 12:21:59.961346375 +0000 UTC Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.272760 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mng44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe4cd320-31b6-43af-a080-c8b4855a1a79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7x6m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mng44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.285043 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91620994-a03e-49f7-aa74-64837a116ac1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d317fd06d2f8b98276f4ae3afa4a2720ef8dcfb2bb7b00edcd7be78eb9d6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60910e8b17fa67e0976aebaa68b8ceeb5f5b17d322777342042a04f5ef27db98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sf44j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6pj9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" 
Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.295376 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fd1ba0-c538-4040-8f48-7e73df015a37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f869d451c9feadf03362ecb0200659be0d5f5e238d4cbd12b5b4d25400e5478e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af
8fe81a7297a623dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ce99972c10bcfd218f38551ccf6e3b1e3f4a29eda068c35edbc88cb7cb6226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4651ce325efa3d8f0e7c1191f4628441eb82f3077fb1f8b66ea4ee0d7a30a020\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0be
d08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.304027 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.304307 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.304606 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.304787 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.305263 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.324035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.337881 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.337938 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.337956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.337989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.338007 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.338240 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.355126 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd2dceb3a199fc513680e23d5c6920959e4dfb0250cf9389f41c4baa2836c259\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:51:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.363991 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3539113f-fe53-40a0-a08c-d7f86951d067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c4ce9c8f86547ec08e76b84fa283a7c2bc7ba91730dc198a577c6386e41b270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmvjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stl56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.383387 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"852d93f4-af9e-413f-8d64-c013edc14dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-df4q8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hf24q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.391321 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqkt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d340a5a9-2dbb-4a21-b9a4-c9b99ffb4ca7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9f78d341a4494702ccc46ca89383e0015687373aac6f10dd095dd144709fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nq559\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqkt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.401512 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d5hf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3819a037-a2a1-433f-884e-84bead904558\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:51:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn86p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn86p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:51:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d5hf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.413874 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d9ff603-fc42-4716-bfad-5dba64a2d188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T06:50:45Z\\\",\\\"message\\\":\\\"file observer\\\\nW0128 06:50:45.082444 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 06:50:45.082642 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 06:50:45.083278 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-553349784/tls.crt::/tmp/serving-cert-553349784/tls.key\\\\\\\"\\\\nI0128 06:50:45.362196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 06:50:45.364991 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 06:50:45.365011 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 06:50:45.365034 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 06:50:45.365039 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 06:50:45.369629 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0128 06:50:45.369659 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0128 06:50:45.369672 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369688 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 06:50:45.369704 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 06:50:45.369714 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 06:50:45.369723 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 06:50:45.369732 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0128 06:50:45.372712 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T06:50:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814b
e47be98ae190d00133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T06:50:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T06:50:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T06:50:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.424970 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T06:50:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.440499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.440573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.440659 4776 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.440686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.440700 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.509980 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.510164 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:20.510127799 +0000 UTC m=+51.925787979 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.510929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.510988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.511161 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.511230 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:20.511205395 +0000 UTC m=+51.926865585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.511336 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.511381 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:20.511368149 +0000 UTC m=+51.927028349 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.543524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.543600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.543609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.543626 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 
28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.543639 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.612309 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.612574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612714 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612800 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612828 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612714 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612879 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612891 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612920 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:20.612889273 +0000 UTC m=+52.028549663 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:51:04 crc kubenswrapper[4776]: E0128 06:51:04.612942 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:20.612934694 +0000 UTC m=+52.028595104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.646956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.647198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.647293 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.647370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.647458 4776 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.750273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.750339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.750353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.750394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.750409 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.853342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.853381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.853391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.853406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.853415 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.957190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.957267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.957282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.957309 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:04 crc kubenswrapper[4776]: I0128 06:51:04.957331 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:04Z","lastTransitionTime":"2026-01-28T06:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.061453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.061539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.061600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.061632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.061657 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:05Z","lastTransitionTime":"2026-01-28T06:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.164937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.164992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.165009 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.165032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.165050 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:05Z","lastTransitionTime":"2026-01-28T06:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.220131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:05 crc kubenswrapper[4776]: E0128 06:51:05.220346 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:05 crc kubenswrapper[4776]: E0128 06:51:05.220471 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs podName:3819a037-a2a1-433f-884e-84bead904558 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:07.220434084 +0000 UTC m=+38.636094404 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs") pod "network-metrics-daemon-d5hf2" (UID: "3819a037-a2a1-433f-884e-84bead904558") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.221140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"20ed6bcd3f36e87de1351586fd5da8ebc73cb8bf135344b876839dd99a03f874"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.221241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"86156078e5ed1a417f6c7a6ce2c6e77b0d20fbc51ffd4ca9e624be2bc307cfc9"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.223708 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wlgk" event={"ID":"42dfc5af-3617-4121-9b26-9593c827a536","Type":"ContainerStarted","Data":"e640ba5d8ef7f9c5543bbd3c404255fd0ebb6bf909f1e35adb5497822c190bac"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.267871 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.267924 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.267937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.267960 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:05 crc kubenswrapper[4776]: 
I0128 06:51:05.267973 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:05Z","lastTransitionTime":"2026-01-28T06:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.269323 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 20:51:40.962034735 +0000 UTC Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.304434 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.304437 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:05 crc kubenswrapper[4776]: E0128 06:51:05.304587 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:51:05 crc kubenswrapper[4776]: E0128 06:51:05.304699 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d5hf2" podUID="3819a037-a2a1-433f-884e-84bead904558" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.330117 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podStartSLOduration=17.330093918 podStartE2EDuration="17.330093918s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.329209717 +0000 UTC m=+36.744869887" watchObservedRunningTime="2026-01-28 06:51:05.330093918 +0000 UTC m=+36.745754078" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.370432 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.370463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.370471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.370487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.370497 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:05Z","lastTransitionTime":"2026-01-28T06:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.374007 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podStartSLOduration=16.373980953 podStartE2EDuration="16.373980953s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.360642306 +0000 UTC m=+36.776302466" watchObservedRunningTime="2026-01-28 06:51:05.373980953 +0000 UTC m=+36.789641113" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.374295 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sqkt7" podStartSLOduration=17.37428979 podStartE2EDuration="17.37428979s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.372916717 +0000 UTC m=+36.788576877" watchObservedRunningTime="2026-01-28 06:51:05.37428979 +0000 UTC m=+36.789949950" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.375448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.375481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.375491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.375508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.375518 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T06:51:05Z","lastTransitionTime":"2026-01-28T06:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.403755 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mng44" podStartSLOduration=16.403730802 podStartE2EDuration="16.403730802s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.388114689 +0000 UTC m=+36.803774859" watchObservedRunningTime="2026-01-28 06:51:05.403730802 +0000 UTC m=+36.819390962" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.404236 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6pj9t" podStartSLOduration=16.404230884 podStartE2EDuration="16.404230884s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.403572008 +0000 UTC m=+36.819232178" watchObservedRunningTime="2026-01-28 06:51:05.404230884 +0000 UTC m=+36.819891034" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.425248 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=15.425230677 podStartE2EDuration="15.425230677s" podCreationTimestamp="2026-01-28 06:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 
06:51:05.423723691 +0000 UTC m=+36.839383851" watchObservedRunningTime="2026-01-28 06:51:05.425230677 +0000 UTC m=+36.840890837" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.426083 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h"] Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.426532 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.428882 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.430499 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.430733 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.431774 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.476197 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hmlx4" podStartSLOduration=16.476171785 podStartE2EDuration="16.476171785s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.4755858 +0000 UTC m=+36.891245960" watchObservedRunningTime="2026-01-28 06:51:05.476171785 +0000 UTC m=+36.891831965" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.523951 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.524054 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.524153 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c213f002-aa3a-49d8-b866-ea00dda64f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.524175 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c213f002-aa3a-49d8-b866-ea00dda64f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.524247 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c213f002-aa3a-49d8-b866-ea00dda64f4a-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.581182 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2wlgk" podStartSLOduration=17.581160264 podStartE2EDuration="17.581160264s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:05.567152781 +0000 UTC m=+36.982812941" watchObservedRunningTime="2026-01-28 06:51:05.581160264 +0000 UTC m=+36.996820424" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625178 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625221 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c213f002-aa3a-49d8-b866-ea00dda64f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c213f002-aa3a-49d8-b866-ea00dda64f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625267 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c213f002-aa3a-49d8-b866-ea00dda64f4a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625304 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.625400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c213f002-aa3a-49d8-b866-ea00dda64f4a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.626087 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c213f002-aa3a-49d8-b866-ea00dda64f4a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.635186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c213f002-aa3a-49d8-b866-ea00dda64f4a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.640890 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c213f002-aa3a-49d8-b866-ea00dda64f4a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jkv9h\" (UID: \"c213f002-aa3a-49d8-b866-ea00dda64f4a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:05 crc kubenswrapper[4776]: I0128 06:51:05.741653 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.227756 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" event={"ID":"c213f002-aa3a-49d8-b866-ea00dda64f4a","Type":"ContainerStarted","Data":"34274634f56fd6cbafb774100c9a844f6632b34c350d5d6458d4961db39966ac"} Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.228017 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" event={"ID":"c213f002-aa3a-49d8-b866-ea00dda64f4a","Type":"ContainerStarted","Data":"50964f84e74220838007524d3f6935400ab85c06acde5e9356be7ce078c4896d"} Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.229019 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"796c4056e0ba70bc5bad843414d2dfe3516f139b87670425e46fcda05236ad36"} Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.253367 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jkv9h" podStartSLOduration=18.253346777 podStartE2EDuration="18.253346777s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:06.241572039 +0000 UTC m=+37.657232199" watchObservedRunningTime="2026-01-28 06:51:06.253346777 +0000 UTC m=+37.669006937" Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.270202 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 17:45:00.755656483 +0000 UTC Jan 28 06:51:06 crc kubenswrapper[4776]: 
I0128 06:51:06.270271 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.278619 4776 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.303713 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.303791 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:06 crc kubenswrapper[4776]: E0128 06:51:06.304101 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:51:06 crc kubenswrapper[4776]: E0128 06:51:06.304130 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.360266 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d5hf2"] Jan 28 06:51:06 crc kubenswrapper[4776]: I0128 06:51:06.360400 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:06 crc kubenswrapper[4776]: E0128 06:51:06.360479 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5hf2" podUID="3819a037-a2a1-433f-884e-84bead904558" Jan 28 06:51:07 crc kubenswrapper[4776]: I0128 06:51:07.242577 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:07 crc kubenswrapper[4776]: E0128 06:51:07.242730 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:07 crc kubenswrapper[4776]: E0128 06:51:07.243103 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs podName:3819a037-a2a1-433f-884e-84bead904558 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:11.243083983 +0000 UTC m=+42.658744153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs") pod "network-metrics-daemon-d5hf2" (UID: "3819a037-a2a1-433f-884e-84bead904558") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 06:51:07 crc kubenswrapper[4776]: I0128 06:51:07.304322 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:07 crc kubenswrapper[4776]: I0128 06:51:07.304594 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:51:07 crc kubenswrapper[4776]: E0128 06:51:07.304639 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.235087 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.236653 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2b1f771b5dea98c61f6affa0c5fabc211fba22627c2c44b77802a6114621eea"} Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.236911 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.263138 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.26312048 podStartE2EDuration="20.26312048s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:08.262648669 +0000 UTC m=+39.678308829" 
watchObservedRunningTime="2026-01-28 06:51:08.26312048 +0000 UTC m=+39.678780640" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.304248 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.304302 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:08 crc kubenswrapper[4776]: E0128 06:51:08.304353 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.304378 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:08 crc kubenswrapper[4776]: E0128 06:51:08.304462 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d5hf2" podUID="3819a037-a2a1-433f-884e-84bead904558" Jan 28 06:51:08 crc kubenswrapper[4776]: E0128 06:51:08.304533 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.963957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.964156 4776 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.997745 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:08 crc kubenswrapper[4776]: I0128 06:51:08.998226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:08 crc kubenswrapper[4776]: W0128 06:51:08.999712 4776 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:08 crc kubenswrapper[4776]: E0128 06:51:08.999765 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.003528 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wjw44"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.004127 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.007324 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.009132 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010508 4776 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010566 4776 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010612 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010608 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource 
\"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010626 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010567 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010665 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010676 4776 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 
06:51:09.010692 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010676 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010737 4776 reflector.go:561] object-"openshift-console"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010750 4776 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010764 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: 
failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010750 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010855 4776 reflector.go:561] object-"openshift-console"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010868 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: W0128 06:51:09.010890 4776 reflector.go:561] object-"openshift-console"/"default-dockercfg-chnjx": failed to list *v1.Secret: secrets "default-dockercfg-chnjx" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 
'crc' and this object Jan 28 06:51:09 crc kubenswrapper[4776]: E0128 06:51:09.010906 4776 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"default-dockercfg-chnjx\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-chnjx\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.013861 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.013998 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fz854"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.014499 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.014590 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.014688 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.015661 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.015951 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xm8h7"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.017090 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.016131 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.017756 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.017326 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.018036 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.018124 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.018223 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.018592 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.018945 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.019238 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.020382 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.020633 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.020811 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.021062 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.023368 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.025982 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.026267 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.026507 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.026910 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.027163 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 
06:51:09.027351 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.027943 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.027979 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.040521 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.040923 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.041822 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.042086 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.042223 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.042716 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.053457 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.053854 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 
06:51:09.054672 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.054787 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055042 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055213 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055221 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055597 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nltsm"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055821 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.055851 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.056229 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.056676 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.056826 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.056686 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.056768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.057211 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.058346 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.059413 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.060032 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ddhh"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.060715 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.061044 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.062529 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wjm9m"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.063263 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.063539 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.063726 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.068043 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.068161 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.069619 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.070415 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.070912 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.071352 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.076115 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.081787 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.082232 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g79fj"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.082566 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5p2jq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.082908 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.083139 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.083280 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.084051 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.084312 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.084849 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085063 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085146 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085206 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085356 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085605 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085627 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 28 06:51:09 crc 
kubenswrapper[4776]: I0128 06:51:09.085783 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.086142 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.086175 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.087168 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.085755 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.098024 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.098353 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.098026 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.098671 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.098982 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.102872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 06:51:09 crc 
kubenswrapper[4776]: I0128 06:51:09.103067 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.103466 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.104664 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.106365 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.106529 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.106979 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.114854 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115202 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115188 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115388 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115732 4776 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k9524"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115798 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.115993 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.116219 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.116238 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.116880 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.120595 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.125000 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.125597 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.125897 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.125922 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.125979 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.126001 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.126019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-clfbw\" (UniqueName: \"kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.126106 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.126420 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.126947 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.127291 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.127293 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.127793 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tcdcf"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.128446 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.127833 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.128908 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.129457 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.130459 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.130613 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.131298 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.131520 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.132697 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.132733 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.133446 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.133833 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.134561 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.134571 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.135330 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.139153 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.139718 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.139840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.140145 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.140345 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.140892 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.142298 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.142721 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.144371 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lft68"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.145198 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.145205 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.145924 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.146620 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.147585 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.147863 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.148195 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.148777 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.149280 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.149963 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.153292 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.154085 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.154802 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.159646 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.160388 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.162446 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.169171 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g4qds"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.170376 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.173479 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.173782 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.176268 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.180045 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.180399 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.181085 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.182115 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wjw44"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.182432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.182968 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.184138 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8b4t7"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.184923 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.185212 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fz854"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.186609 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.188219 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2znv8"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.188992 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.189238 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xm8h7"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.190238 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.191401 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.192646 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.193895 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ddhh"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.195441 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.196181 4776 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pkbgp"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.196864 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.197092 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.198223 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.199468 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.200846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.201111 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.201946 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.204280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5p2jq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.204791 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g79fj"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.206034 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-wjm9m"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.206654 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.207510 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nltsm"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.208440 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.209413 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.210306 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k9524"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.220040 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.223909 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.224909 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.226638 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-trusted-ca\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " 
pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.226693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.226941 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-config\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.226975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4fb7ccb2-9c11-4273-9888-f45aea05803d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227006 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4835eb43-8c9b-4153-9b80-02aeeab54cef-serving-cert\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227064 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djb2w\" (UniqueName: \"kubernetes.io/projected/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-kube-api-access-djb2w\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227131 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fgw\" (UniqueName: \"kubernetes.io/projected/4835eb43-8c9b-4153-9b80-02aeeab54cef-kube-api-access-m7fgw\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227157 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227180 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-config\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227203 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-images\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87679\" (UniqueName: \"kubernetes.io/projected/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-kube-api-access-87679\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227260 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-metrics-tls\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227323 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtng\" (UniqueName: \"kubernetes.io/projected/4fb7ccb2-9c11-4273-9888-f45aea05803d-kube-api-access-9vtng\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dclz\" (UniqueName: \"kubernetes.io/projected/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-kube-api-access-4dclz\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227424 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clfbw\" (UniqueName: \"kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.227488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.228443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lft68"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.229786 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.229840 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.231699 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xlbwv"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.234292 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 
06:51:09.235109 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.240537 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.240773 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.244581 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.245700 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.246769 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.247878 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g4qds"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.249155 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2znv8"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.250127 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pkbgp"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.251127 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4ndtq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.252157 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4ndtq"] Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.252253 
4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.261114 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.280848 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.300459 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.303630 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.321458 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.327999 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-config\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-images\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc 
kubenswrapper[4776]: I0128 06:51:09.328079 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87679\" (UniqueName: \"kubernetes.io/projected/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-kube-api-access-87679\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328123 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vtng\" (UniqueName: \"kubernetes.io/projected/4fb7ccb2-9c11-4273-9888-f45aea05803d-kube-api-access-9vtng\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-metrics-tls\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328170 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dclz\" (UniqueName: \"kubernetes.io/projected/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-kube-api-access-4dclz\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328211 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: 
\"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328287 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-trusted-ca\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-config\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4fb7ccb2-9c11-4273-9888-f45aea05803d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4835eb43-8c9b-4153-9b80-02aeeab54cef-serving-cert\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djb2w\" (UniqueName: \"kubernetes.io/projected/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-kube-api-access-djb2w\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fgw\" (UniqueName: \"kubernetes.io/projected/4835eb43-8c9b-4153-9b80-02aeeab54cef-kube-api-access-m7fgw\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328514 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.328912 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-config\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.329231 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.329618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.329900 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-images\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.331036 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb7ccb2-9c11-4273-9888-f45aea05803d-config\") pod 
\"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.331107 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4835eb43-8c9b-4153-9b80-02aeeab54cef-trusted-ca\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.333430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4835eb43-8c9b-4153-9b80-02aeeab54cef-serving-cert\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.334125 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-metrics-tls\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.334479 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.334563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4fb7ccb2-9c11-4273-9888-f45aea05803d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.335234 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.341043 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.364750 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.384484 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.401099 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.440247 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.464108 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.480291 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.500864 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.521220 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.541355 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.560738 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.580871 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.602189 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.623059 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.640238 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.659470 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.701253 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 
06:51:09.720644 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.740577 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.761042 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.780895 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.800455 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.820025 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.840773 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.861123 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.881060 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.905847 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.920351 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.940933 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.960360 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 28 06:51:09 crc kubenswrapper[4776]: I0128 06:51:09.980863 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.000851 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.020214 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.040876 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.059771 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.080121 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.101368 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.120536 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 28 
06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.140675 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.159339 4776 request.go:700] Waited for 1.018799476s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/secrets?fieldSelector=metadata.name%3Dkube-apiserver-operator-serving-cert&limit=500&resourceVersion=0 Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.161105 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.180809 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.200862 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.221284 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228089 4776 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228148 4776 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228091 4776 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the 
condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228232 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config podName:be5bb707-a7a1-4b88-a75e-0093c14a4764 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:10.72819283 +0000 UTC m=+42.143853030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config") pod "controller-manager-879f6c89f-9fmw6" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764") : failed to sync configmap cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228326 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert podName:be5bb707-a7a1-4b88-a75e-0093c14a4764 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:10.728301853 +0000 UTC m=+42.143962023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert") pod "controller-manager-879f6c89f-9fmw6" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764") : failed to sync secret cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228349 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles podName:be5bb707-a7a1-4b88-a75e-0093c14a4764 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:10.728337024 +0000 UTC m=+42.143997194 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles") pod "controller-manager-879f6c89f-9fmw6" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764") : failed to sync configmap cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228714 4776 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: E0128 06:51:10.228785 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca podName:be5bb707-a7a1-4b88-a75e-0093c14a4764 nodeName:}" failed. No retries permitted until 2026-01-28 06:51:10.728761824 +0000 UTC m=+42.144422074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca") pod "controller-manager-879f6c89f-9fmw6" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764") : failed to sync configmap cache: timed out waiting for the condition Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.241391 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.261207 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.281322 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.301347 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 28 06:51:10 
crc kubenswrapper[4776]: I0128 06:51:10.304481 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.304700 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.304754 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.322214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.340857 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.360414 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.381126 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.401097 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.420707 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.441296 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.468795 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.480320 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.500829 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.521292 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.545015 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.561653 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.582228 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.601981 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.621230 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.642068 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.661959 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.682214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.701826 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.721363 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.742443 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.742994 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.743128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.743423 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:10 
crc kubenswrapper[4776]: I0128 06:51:10.743626 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.761993 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.783182 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.802352 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.822539 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.843768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.861414 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.881693 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.902163 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 28 06:51:10 
crc kubenswrapper[4776]: I0128 06:51:10.921696 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.940877 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.960609 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 28 06:51:10 crc kubenswrapper[4776]: I0128 06:51:10.981792 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.000418 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.041528 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.061902 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.081851 4776 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.103242 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.122328 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.141218 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.159647 4776 request.go:700] Waited for 1.829585092s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.185780 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87679\" (UniqueName: \"kubernetes.io/projected/48500af3-a3ce-4ca8-a6bd-379cb2a0129a-kube-api-access-87679\") pod \"openshift-apiserver-operator-796bbdcf4f-hwz2q\" (UID: \"48500af3-a3ce-4ca8-a6bd-379cb2a0129a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.203423 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vtng\" (UniqueName: \"kubernetes.io/projected/4fb7ccb2-9c11-4273-9888-f45aea05803d-kube-api-access-9vtng\") pod \"machine-api-operator-5694c8668f-xm8h7\" (UID: \"4fb7ccb2-9c11-4273-9888-f45aea05803d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.225643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djb2w\" (UniqueName: \"kubernetes.io/projected/273d91f5-04f4-44b2-8ccf-843e85ea7c7b-kube-api-access-djb2w\") pod \"openshift-controller-manager-operator-756b6f6bc6-k8nsq\" (UID: \"273d91f5-04f4-44b2-8ccf-843e85ea7c7b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.234869 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.250566 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.264817 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.280425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fgw\" (UniqueName: \"kubernetes.io/projected/4835eb43-8c9b-4153-9b80-02aeeab54cef-kube-api-access-m7fgw\") pod \"console-operator-58897d9998-fz854\" (UID: \"4835eb43-8c9b-4153-9b80-02aeeab54cef\") " pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.286718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dclz\" (UniqueName: \"kubernetes.io/projected/5737e5b8-0513-4c1b-b4a4-6f5812f83d4b-kube-api-access-4dclz\") pod \"dns-operator-744455d44c-nltsm\" (UID: \"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b\") " pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.302628 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.311839 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.321251 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.341020 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.344599 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68k2\" (UniqueName: \"kubernetes.io/projected/0b392476-ce74-4f7f-a12f-920531623ef6-kube-api-access-v68k2\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-auth-proxy-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351572 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngs2k\" (UniqueName: 
\"kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351594 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-client\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351618 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c5e07b-212c-404f-bfa8-e96c62028a2a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351686 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-policies\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 
06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351781 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-audit-dir\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351900 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351934 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f22939da-a96d-4aab-8446-5452654bac1e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351959 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.351986 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-encryption-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352031 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352058 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352086 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-encryption-config\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: 
I0128 06:51:11.352123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvxw\" (UniqueName: \"kubernetes.io/projected/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-kube-api-access-ksvxw\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352147 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-config\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352188 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352212 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352238 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c673c63-29c5-42eb-a59a-1350e12bffd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56d29578-4c86-4128-adda-2fd5398645a5-machine-approver-tls\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2wp\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: 
\"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-serving-cert\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ck7\" (UniqueName: \"kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352451 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-serving-cert\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352511 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-service-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352621 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352675 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352706 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" 
Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352728 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352785 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352810 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352842 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-client\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5e07b-212c-404f-bfa8-e96c62028a2a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352903 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.352977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 
crc kubenswrapper[4776]: I0128 06:51:11.353017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49sq\" (UniqueName: \"kubernetes.io/projected/53ea92d3-1ca4-4663-9a90-c9cb24c6bec1-kube-api-access-f49sq\") pod \"downloads-7954f5f757-wjw44\" (UID: \"53ea92d3-1ca4-4663-9a90-c9cb24c6bec1\") " pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353089 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmnd\" (UniqueName: \"kubernetes.io/projected/f22939da-a96d-4aab-8446-5452654bac1e-kube-api-access-fqmnd\") pod 
\"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353135 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-image-import-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353157 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fe013e6-ff17-415a-af5a-c96be0fa82e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353182 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353204 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jvr\" (UniqueName: \"kubernetes.io/projected/db28956e-c117-4203-ba6e-c6eadf3908f7-kube-api-access-h7jvr\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353231 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28lh4\" (UniqueName: \"kubernetes.io/projected/dbdea8ef-a044-48ca-bfac-19023c9fb55d-kube-api-access-28lh4\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353255 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbgm\" (UniqueName: \"kubernetes.io/projected/56d29578-4c86-4128-adda-2fd5398645a5-kube-api-access-xdbgm\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f22939da-a96d-4aab-8446-5452654bac1e-serving-cert\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353360 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353408 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe013e6-ff17-415a-af5a-c96be0fa82e6-config\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353436 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b392476-ce74-4f7f-a12f-920531623ef6-serving-cert\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353497 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353537 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm97q\" (UniqueName: \"kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353651 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353685 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66v5v\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-kube-api-access-66v5v\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-dir\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353750 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr8x\" (UniqueName: \"kubernetes.io/projected/39ece0d0-d290-4488-9111-f4784bebc3b2-kube-api-access-jqr8x\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353790 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353867 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z28ng\" (UniqueName: \"kubernetes.io/projected/29614791-cdee-451e-b670-ac7f3d34d9bb-kube-api-access-z28ng\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353893 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353914 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-config\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353945 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-audit\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.353990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-serving-cert\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354016 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdea8ef-a044-48ca-bfac-19023c9fb55d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354055 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fe013e6-ff17-415a-af5a-c96be0fa82e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354076 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c5e07b-212c-404f-bfa8-e96c62028a2a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c673c63-29c5-42eb-a59a-1350e12bffd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354122 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dbdea8ef-a044-48ca-bfac-19023c9fb55d-proxy-tls\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354145 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-client\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.354199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.354724 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:11.854704914 +0000 UTC m=+43.270365084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.361367 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.381030 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.407133 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.420947 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.441313 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.447804 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3819a037-a2a1-433f-884e-84bead904558-metrics-certs\") pod \"network-metrics-daemon-d5hf2\" (UID: \"3819a037-a2a1-433f-884e-84bead904558\") " pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.454822 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455100 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-stats-auth\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49sq\" (UniqueName: \"kubernetes.io/projected/53ea92d3-1ca4-4663-9a90-c9cb24c6bec1-kube-api-access-f49sq\") pod \"downloads-7954f5f757-wjw44\" (UID: \"53ea92d3-1ca4-4663-9a90-c9cb24c6bec1\") " pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455176 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-csi-data-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmnd\" 
(UniqueName: \"kubernetes.io/projected/f22939da-a96d-4aab-8446-5452654bac1e-kube-api-access-fqmnd\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-image-import-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455252 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fe013e6-ff17-415a-af5a-c96be0fa82e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455276 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f7c9ee3-b72d-4af7-998f-cad0df531c31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455303 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.455326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28lh4\" (UniqueName: \"kubernetes.io/projected/dbdea8ef-a044-48ca-bfac-19023c9fb55d-kube-api-access-28lh4\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455352 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbgm\" (UniqueName: \"kubernetes.io/projected/56d29578-4c86-4128-adda-2fd5398645a5-kube-api-access-xdbgm\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f22939da-a96d-4aab-8446-5452654bac1e-serving-cert\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455400 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njzh\" (UniqueName: \"kubernetes.io/projected/8d8916b5-df67-4492-8c28-8f0d872a4997-kube-api-access-6njzh\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4gb\" (UniqueName: 
\"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-kube-api-access-9d4gb\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b392476-ce74-4f7f-a12f-920531623ef6-serving-cert\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n89s\" (UniqueName: \"kubernetes.io/projected/d26e3039-7460-49d9-8f89-637d57601639-kube-api-access-8n89s\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.455978 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:11.955949332 +0000 UTC m=+43.371609502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.456837 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.458365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-image-import-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.455498 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459678 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-srv-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459729 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjcm\" (UniqueName: \"kubernetes.io/projected/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-kube-api-access-txjcm\") pod \"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d298bb7-36e2-4d16-97f5-0ee37018f44a-config-volume\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459809 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/692617f2-c85f-42ce-b008-feff57211b45-proxy-tls\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-dir\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.459999 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rnms\" (UniqueName: \"kubernetes.io/projected/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-kube-api-access-6rnms\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460033 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-dir\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460492 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7q7\" (UniqueName: \"kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.460888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-serving-cert\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.461056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-service-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.461189 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.461228 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdea8ef-a044-48ca-bfac-19023c9fb55d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462163 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdea8ef-a044-48ca-bfac-19023c9fb55d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9ac969b9-e7de-444a-902f-c2117d50769d-signing-cabundle\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462224 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa8b2d7-b79e-4478-89be-bd227f7715b7-config\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462253 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dbdea8ef-a044-48ca-bfac-19023c9fb55d-proxy-tls\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462259 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f22939da-a96d-4aab-8446-5452654bac1e-serving-cert\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462279 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-socket-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462350 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-client\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: 
\"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462388 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpdbr\" (UniqueName: \"kubernetes.io/projected/9ac969b9-e7de-444a-902f-c2117d50769d-kube-api-access-kpdbr\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68k2\" (UniqueName: \"kubernetes.io/projected/0b392476-ce74-4f7f-a12f-920531623ef6-kube-api-access-v68k2\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-auth-proxy-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ngs2k\" (UniqueName: \"kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462501 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-client\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462536 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-audit-dir\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462590 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c5e07b-212c-404f-bfa8-e96c62028a2a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462607 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-plugins-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d298bb7-36e2-4d16-97f5-0ee37018f44a-metrics-tls\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-encryption-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.462712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462734 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvxw\" (UniqueName: \"kubernetes.io/projected/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-kube-api-access-ksvxw\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462754 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-config\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462774 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhd7b\" (UniqueName: \"kubernetes.io/projected/1d563f89-2d21-46cb-a830-2d5b7403f7a1-kube-api-access-rhd7b\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462793 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ddhh\" (UID: 
\"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b125288e-95aa-474b-9c87-17f11147206f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462830 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-srv-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7c9ee3-b72d-4af7-998f-cad0df531c31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56d29578-4c86-4128-adda-2fd5398645a5-machine-approver-tls\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462892 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdpz\" (UniqueName: \"kubernetes.io/projected/5aa8b2d7-b79e-4478-89be-bd227f7715b7-kube-api-access-8mdpz\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462934 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-certs\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-serving-cert\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-images\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.462993 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-serving-cert\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463030 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-service-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463050 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af394275-eaa5-46bb-a956-97b40d959b18-trusted-ca\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463070 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46t59\" (UniqueName: 
\"kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-node-bootstrap-token\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463105 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463126 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gx7\" (UniqueName: \"kubernetes.io/projected/071d005d-cd96-4f28-b644-982b0f846135-kube-api-access-l2gx7\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 
06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463231 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-webhook-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463256 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463274 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c5e07b-212c-404f-bfa8-e96c62028a2a-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463295 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-registration-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463348 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071d005d-cd96-4f28-b644-982b0f846135-service-ca-bundle\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463373 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7c9ee3-b72d-4af7-998f-cad0df531c31-config\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463413 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463432 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463450 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1d563f89-2d21-46cb-a830-2d5b7403f7a1-tmpfs\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463529 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-apiservice-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jvr\" (UniqueName: \"kubernetes.io/projected/db28956e-c117-4203-ba6e-c6eadf3908f7-kube-api-access-h7jvr\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463581 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463586 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463624 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9ac969b9-e7de-444a-902f-c2117d50769d-signing-key\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.463811 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464837 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464865 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464885 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe013e6-ff17-415a-af5a-c96be0fa82e6-config\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464906 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b125288e-95aa-474b-9c87-17f11147206f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464912 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-serving-cert\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464928 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464930 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-service-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464976 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af394275-eaa5-46bb-a956-97b40d959b18-metrics-tls\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.464998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465081 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465632 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-client\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465676 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-audit-dir\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465761 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465785 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465818 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-default-certificate\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465854 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm97q\" (UniqueName: \"kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465876 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465896 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66v5v\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-kube-api-access-66v5v\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465916 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr8x\" (UniqueName: \"kubernetes.io/projected/39ece0d0-d290-4488-9111-f4784bebc3b2-kube-api-access-jqr8x\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465925 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dbdea8ef-a044-48ca-bfac-19023c9fb55d-proxy-tls\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465935 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-mountpoint-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.465990 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z28ng\" (UniqueName: 
\"kubernetes.io/projected/29614791-cdee-451e-b670-ac7f3d34d9bb-kube-api-access-z28ng\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnddl\" (UniqueName: \"kubernetes.io/projected/2677aa4a-2578-4da6-aec2-ea5e949c94f7-kube-api-access-lnddl\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-config\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466104 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-audit\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466149 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9sg\" (UniqueName: \"kubernetes.io/projected/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-kube-api-access-hd9sg\") pod \"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 
06:51:11.466570 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.466709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-serving-cert\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b392476-ce74-4f7f-a12f-920531623ef6-config\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467467 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b392476-ce74-4f7f-a12f-920531623ef6-serving-cert\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-auth-proxy-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467729 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56d29578-4c86-4128-adda-2fd5398645a5-machine-approver-tls\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-serving-ca\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.467971 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39ece0d0-d290-4488-9111-f4784bebc3b2-node-pullsecrets\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468041 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-audit\") pod \"apiserver-76f77b778f-8ddhh\" (UID: 
\"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fe013e6-ff17-415a-af5a-c96be0fa82e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468106 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c5e07b-212c-404f-bfa8-e96c62028a2a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c673c63-29c5-42eb-a59a-1350e12bffd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468186 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fba465ce-f898-4cca-b8f1-6281aef02eb7-cert\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468242 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8r6\" (UniqueName: \"kubernetes.io/projected/b125288e-95aa-474b-9c87-17f11147206f-kube-api-access-mp8r6\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468263 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-policies\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468461 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe013e6-ff17-415a-af5a-c96be0fa82e6-config\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468676 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8386b67-8be2-4d18-9358-fccd65c363db-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468725 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468752 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f22939da-a96d-4aab-8446-5452654bac1e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468773 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa8b2d7-b79e-4478-89be-bd227f7715b7-serving-cert\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468816 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-profile-collector-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-etcd-client\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-encryption-config\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 
06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468933 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5c9g\" (UniqueName: \"kubernetes.io/projected/fba465ce-f898-4cca-b8f1-6281aef02eb7-kube-api-access-z5c9g\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468959 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.468992 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469020 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469043 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume\") pod 
\"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469078 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469102 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c673c63-29c5-42eb-a59a-1350e12bffd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsv9w\" (UniqueName: \"kubernetes.io/projected/ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6-kube-api-access-dsv9w\") pod \"migrator-59844c95c7-h4g2z\" (UID: \"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2wp\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 
06:51:11.469175 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8s6\" (UniqueName: \"kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74ck7\" (UniqueName: \"kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469221 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr76r\" (UniqueName: \"kubernetes.io/projected/2d298bb7-36e2-4d16-97f5-0ee37018f44a-kube-api-access-gr76r\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469240 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-metrics-certs\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469276 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " 
pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469469 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzp4b\" (UniqueName: \"kubernetes.io/projected/692617f2-c85f-42ce-b008-feff57211b45-kube-api-access-zzp4b\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469521 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469562 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469582 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-client\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469613 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df8z6\" (UniqueName: \"kubernetes.io/projected/a8386b67-8be2-4d18-9358-fccd65c363db-kube-api-access-df8z6\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.469793 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d29578-4c86-4128-adda-2fd5398645a5-config\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.470379 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.470697 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.470880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.471409 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.471431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.471594 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-audit-policies\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.473448 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-config\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.473980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.474181 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-serving-cert\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.474567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.474644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.474673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.475133 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.475140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39ece0d0-d290-4488-9111-f4784bebc3b2-encryption-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.475665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.475820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.476159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-config\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.476935 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/db28956e-c117-4203-ba6e-c6eadf3908f7-encryption-config\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.477171 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c673c63-29c5-42eb-a59a-1350e12bffd7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.477413 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c5e07b-212c-404f-bfa8-e96c62028a2a-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.477506 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.477749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.478052 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f22939da-a96d-4aab-8446-5452654bac1e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.482483 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-ca\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.482951 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d6c5e07b-212c-404f-bfa8-e96c62028a2a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.482987 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.483036 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.483077 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c673c63-29c5-42eb-a59a-1350e12bffd7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.483532 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.484170 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.484251 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.484313 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/db28956e-c117-4203-ba6e-c6eadf3908f7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.484752 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:11.984719526 +0000 UTC m=+43.400379686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.484754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ece0d0-d290-4488-9111-f4784bebc3b2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.485145 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fe013e6-ff17-415a-af5a-c96be0fa82e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.485722 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.486068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/29614791-cdee-451e-b670-ac7f3d34d9bb-etcd-client\") pod \"etcd-operator-b45778765-k9524\" (UID: 
\"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.486966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.488180 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.490212 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.493899 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xm8h7"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.497158 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.503397 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.505025 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") pod 
\"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.513399 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nltsm"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.517172 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.519793 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.523272 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d5hf2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.524377 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.527140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clfbw\" (UniqueName: \"kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.541728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 06:51:11 crc kubenswrapper[4776]: W0128 06:51:11.547672 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273d91f5_04f4_44b2_8ccf_843e85ea7c7b.slice/crio-9ea20cdfc9df7f75a82ce2bc6bb981f61938bef3a7f7ec17218108ffa706055b WatchSource:0}: Error finding container 9ea20cdfc9df7f75a82ce2bc6bb981f61938bef3a7f7ec17218108ffa706055b: Status 404 returned error can't find the container with id 9ea20cdfc9df7f75a82ce2bc6bb981f61938bef3a7f7ec17218108ffa706055b Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.566769 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.570703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.570851 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.070829114 +0000 UTC m=+43.486489264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njzh\" (UniqueName: \"kubernetes.io/projected/8d8916b5-df67-4492-8c28-8f0d872a4997-kube-api-access-6njzh\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571069 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4gb\" (UniqueName: \"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-kube-api-access-9d4gb\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n89s\" (UniqueName: \"kubernetes.io/projected/d26e3039-7460-49d9-8f89-637d57601639-kube-api-access-8n89s\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571114 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571134 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-srv-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571154 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjcm\" (UniqueName: \"kubernetes.io/projected/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-kube-api-access-txjcm\") pod \"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571193 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d298bb7-36e2-4d16-97f5-0ee37018f44a-config-volume\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/692617f2-c85f-42ce-b008-feff57211b45-proxy-tls\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rnms\" (UniqueName: 
\"kubernetes.io/projected/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-kube-api-access-6rnms\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571256 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7q7\" (UniqueName: \"kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571274 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9ac969b9-e7de-444a-902f-c2117d50769d-signing-cabundle\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa8b2d7-b79e-4478-89be-bd227f7715b7-config\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-socket-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571418 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kpdbr\" (UniqueName: \"kubernetes.io/projected/9ac969b9-e7de-444a-902f-c2117d50769d-kube-api-access-kpdbr\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-plugins-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571465 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d298bb7-36e2-4d16-97f5-0ee37018f44a-metrics-tls\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhd7b\" (UniqueName: \"kubernetes.io/projected/1d563f89-2d21-46cb-a830-2d5b7403f7a1-kube-api-access-rhd7b\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b125288e-95aa-474b-9c87-17f11147206f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc 
kubenswrapper[4776]: I0128 06:51:11.571539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-srv-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571764 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdpz\" (UniqueName: \"kubernetes.io/projected/5aa8b2d7-b79e-4478-89be-bd227f7715b7-kube-api-access-8mdpz\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571840 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7c9ee3-b72d-4af7-998f-cad0df531c31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-certs\") pod \"machine-config-server-8b4t7\" (UID: 
\"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-socket-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571894 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-images\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571955 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af394275-eaa5-46bb-a956-97b40d959b18-trusted-ca\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.571998 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-plugins-dir\") pod 
\"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572010 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46t59\" (UniqueName: \"kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572092 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-node-bootstrap-token\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572124 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2gx7\" (UniqueName: \"kubernetes.io/projected/071d005d-cd96-4f28-b644-982b0f846135-kube-api-access-l2gx7\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572173 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-webhook-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572199 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572206 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d298bb7-36e2-4d16-97f5-0ee37018f44a-config-volume\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572251 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-registration-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572272 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071d005d-cd96-4f28-b644-982b0f846135-service-ca-bundle\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7c9ee3-b72d-4af7-998f-cad0df531c31-config\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/1d563f89-2d21-46cb-a830-2d5b7403f7a1-tmpfs\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572406 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-apiservice-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572441 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572495 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9ac969b9-e7de-444a-902f-c2117d50769d-signing-key\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b125288e-95aa-474b-9c87-17f11147206f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 
06:51:11.572575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af394275-eaa5-46bb-a956-97b40d959b18-metrics-tls\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572646 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-default-certificate\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-mountpoint-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnddl\" (UniqueName: \"kubernetes.io/projected/2677aa4a-2578-4da6-aec2-ea5e949c94f7-kube-api-access-lnddl\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9sg\" (UniqueName: \"kubernetes.io/projected/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-kube-api-access-hd9sg\") pod \"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fba465ce-f898-4cca-b8f1-6281aef02eb7-cert\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572863 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8r6\" (UniqueName: \"kubernetes.io/projected/b125288e-95aa-474b-9c87-17f11147206f-kube-api-access-mp8r6\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572883 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572908 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8386b67-8be2-4d18-9358-fccd65c363db-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa8b2d7-b79e-4478-89be-bd227f7715b7-serving-cert\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572964 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-profile-collector-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.572995 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5c9g\" (UniqueName: \"kubernetes.io/projected/fba465ce-f898-4cca-b8f1-6281aef02eb7-kube-api-access-z5c9g\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573076 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsv9w\" (UniqueName: \"kubernetes.io/projected/ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6-kube-api-access-dsv9w\") 
pod \"migrator-59844c95c7-h4g2z\" (UID: \"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573102 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8s6\" (UniqueName: \"kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr76r\" (UniqueName: \"kubernetes.io/projected/2d298bb7-36e2-4d16-97f5-0ee37018f44a-kube-api-access-gr76r\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573154 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-metrics-certs\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573175 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzp4b\" (UniqueName: \"kubernetes.io/projected/692617f2-c85f-42ce-b008-feff57211b45-kube-api-access-zzp4b\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573196 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573226 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df8z6\" (UniqueName: \"kubernetes.io/projected/a8386b67-8be2-4d18-9358-fccd65c363db-kube-api-access-df8z6\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-stats-auth\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-csi-data-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.573313 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f7c9ee3-b72d-4af7-998f-cad0df531c31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.574106 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9ac969b9-e7de-444a-902f-c2117d50769d-signing-cabundle\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.574712 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa8b2d7-b79e-4478-89be-bd227f7715b7-config\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.575331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-images\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.575424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.575887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d298bb7-36e2-4d16-97f5-0ee37018f44a-metrics-tls\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.575996 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.07598696 +0000 UTC m=+43.491647120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.582604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.583019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9fmw6\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.583272 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 
06:51:11.583737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-certs\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.583826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-registration-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.584043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-srv-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.584086 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/692617f2-c85f-42ce-b008-feff57211b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.585256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1d563f89-2d21-46cb-a830-2d5b7403f7a1-tmpfs\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 
06:51:11.585946 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.586040 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-mountpoint-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.586068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d26e3039-7460-49d9-8f89-637d57601639-csi-data-dir\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.586080 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.586535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7c9ee3-b72d-4af7-998f-cad0df531c31-config\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.586613 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071d005d-cd96-4f28-b644-982b0f846135-service-ca-bundle\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.587835 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b125288e-95aa-474b-9c87-17f11147206f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.587850 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.588020 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/692617f2-c85f-42ce-b008-feff57211b45-proxy-tls\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.589466 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af394275-eaa5-46bb-a956-97b40d959b18-trusted-ca\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.591372 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-webhook-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.591368 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b125288e-95aa-474b-9c87-17f11147206f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.591438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.591758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7c9ee3-b72d-4af7-998f-cad0df531c31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.592419 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5aa8b2d7-b79e-4478-89be-bd227f7715b7-serving-cert\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.593690 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9ac969b9-e7de-444a-902f-c2117d50769d-signing-key\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.595111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.596683 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-default-certificate\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.599431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.600281 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2677aa4a-2578-4da6-aec2-ea5e949c94f7-node-bootstrap-token\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.600359 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8386b67-8be2-4d18-9358-fccd65c363db-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.600907 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-stats-auth\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.600991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/071d005d-cd96-4f28-b644-982b0f846135-metrics-certs\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.600991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d563f89-2d21-46cb-a830-2d5b7403f7a1-apiservice-cert\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:11 
crc kubenswrapper[4776]: I0128 06:51:11.601014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-profile-collector-cert\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.601041 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d8916b5-df67-4492-8c28-8f0d872a4997-srv-cert\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.602323 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fba465ce-f898-4cca-b8f1-6281aef02eb7-cert\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.605041 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.616437 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af394275-eaa5-46bb-a956-97b40d959b18-metrics-tls\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" 
Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.619972 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fe013e6-ff17-415a-af5a-c96be0fa82e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m58q2\" (UID: \"0fe013e6-ff17-415a-af5a-c96be0fa82e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.653906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49sq\" (UniqueName: \"kubernetes.io/projected/53ea92d3-1ca4-4663-9a90-c9cb24c6bec1-kube-api-access-f49sq\") pod \"downloads-7954f5f757-wjw44\" (UID: \"53ea92d3-1ca4-4663-9a90-c9cb24c6bec1\") " pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.664323 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqmnd\" (UniqueName: \"kubernetes.io/projected/f22939da-a96d-4aab-8446-5452654bac1e-kube-api-access-fqmnd\") pod \"openshift-config-operator-7777fb866f-g79fj\" (UID: \"f22939da-a96d-4aab-8446-5452654bac1e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.673937 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.674206 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:12.174173013 +0000 UTC m=+43.589833173 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.674774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.675404 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.175395404 +0000 UTC m=+43.591055564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.678865 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.683313 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28lh4\" (UniqueName: \"kubernetes.io/projected/dbdea8ef-a044-48ca-bfac-19023c9fb55d-kube-api-access-28lh4\") pod \"machine-config-controller-84d6567774-xq6gp\" (UID: \"dbdea8ef-a044-48ca-bfac-19023c9fb55d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.687584 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.702636 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbgm\" (UniqueName: \"kubernetes.io/projected/56d29578-4c86-4128-adda-2fd5398645a5-kube-api-access-xdbgm\") pod \"machine-approver-56656f9798-qqqv5\" (UID: \"56d29578-4c86-4128-adda-2fd5398645a5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.703791 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.717840 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.724035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68k2\" (UniqueName: \"kubernetes.io/projected/0b392476-ce74-4f7f-a12f-920531623ef6-kube-api-access-v68k2\") pod \"authentication-operator-69f744f599-5p2jq\" (UID: \"0b392476-ce74-4f7f-a12f-920531623ef6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.737188 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksvxw\" (UniqueName: \"kubernetes.io/projected/2e9f99f8-da42-4c58-a58f-9ebd2a12fca3-kube-api-access-ksvxw\") pod \"cluster-samples-operator-665b6dd947-vv9ms\" (UID: \"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.745952 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:11 crc kubenswrapper[4776]: W0128 06:51:11.757984 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48500af3_a3ce_4ca8_a6bd_379cb2a0129a.slice/crio-6ca7a8e953e2450c52c9a1fe9f4b2472306d74eb637f6ba3f5bdfc7fdc989aad WatchSource:0}: Error finding container 6ca7a8e953e2450c52c9a1fe9f4b2472306d74eb637f6ba3f5bdfc7fdc989aad: Status 404 returned error can't find the container with id 6ca7a8e953e2450c52c9a1fe9f4b2472306d74eb637f6ba3f5bdfc7fdc989aad Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.765470 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngs2k\" (UniqueName: \"kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k\") pod \"route-controller-manager-6576b87f9c-zcm9b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.777925 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d5hf2"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.778055 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.778704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jvr\" (UniqueName: \"kubernetes.io/projected/db28956e-c117-4203-ba6e-c6eadf3908f7-kube-api-access-h7jvr\") pod \"apiserver-7bbb656c7d-2qxtd\" (UID: \"db28956e-c117-4203-ba6e-c6eadf3908f7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.778824 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.278803684 +0000 UTC m=+43.694463844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.799858 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z28ng\" (UniqueName: \"kubernetes.io/projected/29614791-cdee-451e-b670-ac7f3d34d9bb-kube-api-access-z28ng\") pod \"etcd-operator-b45778765-k9524\" (UID: \"29614791-cdee-451e-b670-ac7f3d34d9bb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.818403 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm97q\" (UniqueName: \"kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q\") pod \"console-f9d7485db-p5b6p\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.818634 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fz854"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.841152 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-66v5v\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-kube-api-access-66v5v\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.852885 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.862993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr8x\" (UniqueName: \"kubernetes.io/projected/39ece0d0-d290-4488-9111-f4784bebc3b2-kube-api-access-jqr8x\") pod \"apiserver-76f77b778f-8ddhh\" (UID: \"39ece0d0-d290-4488-9111-f4784bebc3b2\") " pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.881262 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.881725 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.381703853 +0000 UTC m=+43.797364103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.882733 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2wp\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.882784 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.902933 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.904640 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c673c63-29c5-42eb-a59a-1350e12bffd7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r48r6\" (UID: \"9c673c63-29c5-42eb-a59a-1350e12bffd7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.920382 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.924453 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2"] Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.925059 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.933439 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.940764 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c5e07b-212c-404f-bfa8-e96c62028a2a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vxslm\" (UID: \"d6c5e07b-212c-404f-bfa8-e96c62028a2a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.943717 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ck7\" (UniqueName: \"kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7\") pod \"oauth-openshift-558db77b4-wjm9m\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.946292 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.961115 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.969635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.969755 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.980918 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njzh\" (UniqueName: \"kubernetes.io/projected/8d8916b5-df67-4492-8c28-8f0d872a4997-kube-api-access-6njzh\") pod \"olm-operator-6b444d44fb-4djdr\" (UID: \"8d8916b5-df67-4492-8c28-8f0d872a4997\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.982128 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:11 crc kubenswrapper[4776]: E0128 06:51:11.982803 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.482784677 +0000 UTC m=+43.898444837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.994512 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" Jan 28 06:51:11 crc kubenswrapper[4776]: I0128 06:51:11.999438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4gb\" (UniqueName: \"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-kube-api-access-9d4gb\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.010757 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.013970 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g79fj"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.019541 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.021111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7q7\" (UniqueName: \"kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7\") pod \"collect-profiles-29493045-67htw\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.037273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rnms\" (UniqueName: \"kubernetes.io/projected/29a5bcd2-ab98-42f3-b1f8-0aca08cc1552-kube-api-access-6rnms\") pod \"catalog-operator-68c6474976-hs6h4\" (UID: \"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.060209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhd7b\" (UniqueName: \"kubernetes.io/projected/1d563f89-2d21-46cb-a830-2d5b7403f7a1-kube-api-access-rhd7b\") pod \"packageserver-d55dfcdfc-vr8bd\" (UID: \"1d563f89-2d21-46cb-a830-2d5b7403f7a1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:12 crc kubenswrapper[4776]: W0128 06:51:12.063537 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22939da_a96d_4aab_8446_5452654bac1e.slice/crio-7515b7d2ab8415707884169b6a1e748e51b6ad88ab096d617e2b21fcb058f3e0 WatchSource:0}: Error finding container 7515b7d2ab8415707884169b6a1e748e51b6ad88ab096d617e2b21fcb058f3e0: Status 404 returned error can't find the container with id 7515b7d2ab8415707884169b6a1e748e51b6ad88ab096d617e2b21fcb058f3e0 Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.066195 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.082389 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpdbr\" (UniqueName: \"kubernetes.io/projected/9ac969b9-e7de-444a-902f-c2117d50769d-kube-api-access-kpdbr\") pod \"service-ca-9c57cc56f-g4qds\" (UID: \"9ac969b9-e7de-444a-902f-c2117d50769d\") " pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.084292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.086219 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.586202458 +0000 UTC m=+44.001862628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.108492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46t59\" (UniqueName: \"kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59\") pod \"marketplace-operator-79b997595-2vn7f\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.118159 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wjw44"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.118348 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.124146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f7c9ee3-b72d-4af7-998f-cad0df531c31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fnvp9\" (UID: \"1f7c9ee3-b72d-4af7-998f-cad0df531c31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.127680 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.136801 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.143640 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.147442 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjcm\" (UniqueName: \"kubernetes.io/projected/55effd60-d9e1-4104-ac9a-2ed1d9c7e31b-kube-api-access-txjcm\") pod \"multus-admission-controller-857f4d67dd-lft68\" (UID: \"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.158709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n89s\" (UniqueName: \"kubernetes.io/projected/d26e3039-7460-49d9-8f89-637d57601639-kube-api-access-8n89s\") pod \"csi-hostpathplugin-4ndtq\" (UID: \"d26e3039-7460-49d9-8f89-637d57601639\") " pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.158844 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.177337 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.180974 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdpz\" (UniqueName: \"kubernetes.io/projected/5aa8b2d7-b79e-4478-89be-bd227f7715b7-kube-api-access-8mdpz\") pod \"service-ca-operator-777779d784-dr6wg\" (UID: \"5aa8b2d7-b79e-4478-89be-bd227f7715b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.185069 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.185296 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.685273054 +0000 UTC m=+44.100933214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.185622 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.186026 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.686011112 +0000 UTC m=+44.101671272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.192244 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.198392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af394275-eaa5-46bb-a956-97b40d959b18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56zbt\" (UID: \"af394275-eaa5-46bb-a956-97b40d959b18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.230642 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsv9w\" (UniqueName: \"kubernetes.io/projected/ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6-kube-api-access-dsv9w\") pod \"migrator-59844c95c7-h4g2z\" (UID: \"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.245473 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8s6\" (UniqueName: \"kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6\") pod \"cni-sysctl-allowlist-ds-xlbwv\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.247887 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.272043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr76r\" (UniqueName: \"kubernetes.io/projected/2d298bb7-36e2-4d16-97f5-0ee37018f44a-kube-api-access-gr76r\") pod \"dns-default-pkbgp\" (UID: \"2d298bb7-36e2-4d16-97f5-0ee37018f44a\") " pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.281227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5b6p" event={"ID":"a6e763c5-5d99-4374-9ade-5ac3ff4b9817","Type":"ContainerStarted","Data":"3d89ab939b57aedd844c91620553f4e231a51d0fdc5bace9ea9e7914f20dd8cd"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.284455 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" event={"ID":"be5bb707-a7a1-4b88-a75e-0093c14a4764","Type":"ContainerStarted","Data":"ce4d259758def76a79a071f5bef5eab36501b29656f2d097f1e9b3c2c96d81c0"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.285646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzp4b\" (UniqueName: \"kubernetes.io/projected/692617f2-c85f-42ce-b008-feff57211b45-kube-api-access-zzp4b\") pod \"machine-config-operator-74547568cd-hxpsz\" (UID: \"692617f2-c85f-42ce-b008-feff57211b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.286344 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 
06:51:12.286792 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.786771628 +0000 UTC m=+44.202431788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.307504 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5c9g\" (UniqueName: \"kubernetes.io/projected/fba465ce-f898-4cca-b8f1-6281aef02eb7-kube-api-access-z5c9g\") pod \"ingress-canary-2znv8\" (UID: \"fba465ce-f898-4cca-b8f1-6281aef02eb7\") " pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.317373 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.322755 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gx7\" (UniqueName: \"kubernetes.io/projected/071d005d-cd96-4f28-b644-982b0f846135-kube-api-access-l2gx7\") pod \"router-default-5444994796-tcdcf\" (UID: \"071d005d-cd96-4f28-b644-982b0f846135\") " pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.323139 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" event={"ID":"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b","Type":"ContainerStarted","Data":"f5aced4f6b555be87a5d3d91cb4f6c2c44168acd676a52d4b7833a375d298f87"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.323180 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" event={"ID":"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b","Type":"ContainerStarted","Data":"ed699e964e01e023fb64ef1cc9650de27c3697a87c17dcab8697aa2e4c120e36"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.343280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.343590 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" event={"ID":"4fb7ccb2-9c11-4273-9888-f45aea05803d","Type":"ContainerStarted","Data":"7b808f13fb01f16987162f1a47d875611c93c29146678d00ce7b5ca860f0f17d"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.343630 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" 
event={"ID":"4fb7ccb2-9c11-4273-9888-f45aea05803d","Type":"ContainerStarted","Data":"ac5683e30a8172bc906f459033a7060e0b8fa25cb38546b058bdf21e984706cb"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.343641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" event={"ID":"4fb7ccb2-9c11-4273-9888-f45aea05803d","Type":"ContainerStarted","Data":"1f0597693c6ca14540daed91127d8b24d71bc9a88a665e9818919033a47403bd"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.344836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" event={"ID":"0fe013e6-ff17-415a-af5a-c96be0fa82e6","Type":"ContainerStarted","Data":"b017338b301e25881824191bd1689e4f916ee47162b461607f3a8327939616cd"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.346158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" event={"ID":"56d29578-4c86-4128-adda-2fd5398645a5","Type":"ContainerStarted","Data":"8dd92f8f38e560f5f0c7d25caa2bbb84de92300df74e57653078b70fe6903eb2"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.355171 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fz854" event={"ID":"4835eb43-8c9b-4153-9b80-02aeeab54cef","Type":"ContainerStarted","Data":"de6357c8c3d5c1236731c41bd41a8286e4a559286fa57ad90a02f855158eacf9"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.355223 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fz854" event={"ID":"4835eb43-8c9b-4153-9b80-02aeeab54cef","Type":"ContainerStarted","Data":"f8f8a864c23b5ea46e79fb043273974d0f5a399831f6fc65ceb77f17a833fe10"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.355603 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.359066 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8r6\" (UniqueName: \"kubernetes.io/projected/b125288e-95aa-474b-9c87-17f11147206f-kube-api-access-mp8r6\") pod \"kube-storage-version-migrator-operator-b67b599dd-2pxtc\" (UID: \"b125288e-95aa-474b-9c87-17f11147206f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.359434 4776 patch_prober.go:28] interesting pod/console-operator-58897d9998-fz854 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.359466 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fz854" podUID="4835eb43-8c9b-4153-9b80-02aeeab54cef" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.361587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wjw44" event={"ID":"53ea92d3-1ca4-4663-9a90-c9cb24c6bec1","Type":"ContainerStarted","Data":"64c9f58abee3b8abed78332e9f37150a6704b73b3f31cabbffb5c69cd23cb404"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.368952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5hf2" event={"ID":"3819a037-a2a1-433f-884e-84bead904558","Type":"ContainerStarted","Data":"fc5d1c3544699c9ae74b3308564988dc20d4d2111e16e4035fe324d3026ef6aa"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 
06:51:12.370530 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8z6\" (UniqueName: \"kubernetes.io/projected/a8386b67-8be2-4d18-9358-fccd65c363db-kube-api-access-df8z6\") pod \"control-plane-machine-set-operator-78cbb6b69f-5bxjg\" (UID: \"a8386b67-8be2-4d18-9358-fccd65c363db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.373525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" event={"ID":"48500af3-a3ce-4ca8-a6bd-379cb2a0129a","Type":"ContainerStarted","Data":"17527d1a99b9c124efb4bc2965496bf3b486dfd02897226f9158f02181b47f62"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.373602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" event={"ID":"48500af3-a3ce-4ca8-a6bd-379cb2a0129a","Type":"ContainerStarted","Data":"6ca7a8e953e2450c52c9a1fe9f4b2472306d74eb637f6ba3f5bdfc7fdc989aad"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.375304 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.376838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" event={"ID":"273d91f5-04f4-44b2-8ccf-843e85ea7c7b","Type":"ContainerStarted","Data":"e36ebc675bc29ce0ad118ab37e04ea9a5b6456f8cb26feae6dfb532ce1729a53"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.376877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" event={"ID":"273d91f5-04f4-44b2-8ccf-843e85ea7c7b","Type":"ContainerStarted","Data":"9ea20cdfc9df7f75a82ce2bc6bb981f61938bef3a7f7ec17218108ffa706055b"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.384362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnddl\" (UniqueName: \"kubernetes.io/projected/2677aa4a-2578-4da6-aec2-ea5e949c94f7-kube-api-access-lnddl\") pod \"machine-config-server-8b4t7\" (UID: \"2677aa4a-2578-4da6-aec2-ea5e949c94f7\") " pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.387802 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.388594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.388960 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.888947029 +0000 UTC m=+44.304607189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.395888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9sg\" (UniqueName: \"kubernetes.io/projected/90d1a64c-a8ef-4af9-a3ce-fa6357b570d7-kube-api-access-hd9sg\") pod \"package-server-manager-789f6589d5-sznv4\" (UID: \"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.411076 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.431082 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" event={"ID":"f22939da-a96d-4aab-8446-5452654bac1e","Type":"ContainerStarted","Data":"7515b7d2ab8415707884169b6a1e748e51b6ad88ab096d617e2b21fcb058f3e0"} Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.446478 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.487825 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.489326 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.490125 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:12.990082644 +0000 UTC m=+44.405742804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.501136 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8b4t7" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.506477 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2znv8" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.516022 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.526035 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.540380 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.591398 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.591886 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.091861946 +0000 UTC m=+44.507522106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.608761 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.629267 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.633733 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.638133 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wjm9m"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.651880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.656939 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.670371 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5p2jq"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.695049 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.695770 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.195748529 +0000 UTC m=+44.611408689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.695894 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.696285 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.698721 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8ddhh"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.796828 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.797459 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.297432747 +0000 UTC m=+44.713093107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.893643 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.899771 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:12 crc kubenswrapper[4776]: E0128 06:51:12.900448 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.400422578 +0000 UTC m=+44.816082738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.918583 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.919028 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fz854" podStartSLOduration=23.919004322 podStartE2EDuration="23.919004322s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:12.915525968 +0000 UTC m=+44.331186128" watchObservedRunningTime="2026-01-28 06:51:12.919004322 +0000 UTC m=+44.334664482" Jan 28 06:51:12 crc kubenswrapper[4776]: I0128 06:51:12.953139 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwz2q" podStartSLOduration=24.953117648 podStartE2EDuration="24.953117648s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:12.952104813 +0000 UTC m=+44.367764973" watchObservedRunningTime="2026-01-28 06:51:12.953117648 +0000 UTC m=+44.368777808" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.002207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.002635 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.502619309 +0000 UTC m=+44.918279479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.066658 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k9524"] Jan 28 06:51:13 crc kubenswrapper[4776]: W0128 06:51:13.083768 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071d005d_cd96_4f28_b644_982b0f846135.slice/crio-0ad8683ea1be0a9a4befd5c2bb7ffd675444677434867668788aad268b67c3e3 WatchSource:0}: Error finding container 0ad8683ea1be0a9a4befd5c2bb7ffd675444677434867668788aad268b67c3e3: Status 404 returned error can't find the container with id 0ad8683ea1be0a9a4befd5c2bb7ffd675444677434867668788aad268b67c3e3 Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.103268 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.103785 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.603760385 +0000 UTC m=+45.019420545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.197215 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k8nsq" podStartSLOduration=24.197194502 podStartE2EDuration="24.197194502s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:13.193481091 +0000 UTC m=+44.609141251" watchObservedRunningTime="2026-01-28 06:51:13.197194502 +0000 UTC m=+44.612854662" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.205130 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.205516 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.705499876 +0000 UTC m=+45.121160036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.308352 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.310004 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.809988493 +0000 UTC m=+45.225648643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.342845 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xm8h7" podStartSLOduration=24.342820287 podStartE2EDuration="24.342820287s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:13.331596302 +0000 UTC m=+44.747256462" watchObservedRunningTime="2026-01-28 06:51:13.342820287 +0000 UTC m=+44.758480447" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.413180 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.413540 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:13.913525937 +0000 UTC m=+45.329186097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.513983 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.514500 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.014381686 +0000 UTC m=+45.430041856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.514681 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.515118 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.015097793 +0000 UTC m=+45.430757953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.539815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" event={"ID":"1d563f89-2d21-46cb-a830-2d5b7403f7a1","Type":"ContainerStarted","Data":"31e6ffdd7707967d20c4a9443ad62b74645badb7b12642f10a758aa9093ab5a9"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.541702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" event={"ID":"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a","Type":"ContainerStarted","Data":"d0b27cdc494f409d0213644148415452e415155acd88e7b7396d40c5e2104440"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.546992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" event={"ID":"5737e5b8-0513-4c1b-b4a4-6f5812f83d4b","Type":"ContainerStarted","Data":"8d0029f1f9be6590dea1a94bdd4c84c22280c38cb6bd29749d7930b183ecf301"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.560715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" event={"ID":"0fe013e6-ff17-415a-af5a-c96be0fa82e6","Type":"ContainerStarted","Data":"ed7d8d8840682cd5834044086b24e550b76139445f25146318a50216b86a752e"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.567191 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" event={"ID":"be5bb707-a7a1-4b88-a75e-0093c14a4764","Type":"ContainerStarted","Data":"46dc529a83d924236bf0101d580a2a32093a154ee95f9a222019c276d5d7eb15"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.568122 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.595307 4776 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9fmw6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.595372 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.600406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" event={"ID":"d6c5e07b-212c-404f-bfa8-e96c62028a2a","Type":"ContainerStarted","Data":"4ac62f23af2f9c5fe7a6fc93b1beb2d6a1cfef50c022cfd57fcba018dc492cb7"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.616354 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.619809 
4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.119777316 +0000 UTC m=+45.535437476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.631821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5hf2" event={"ID":"3819a037-a2a1-433f-884e-84bead904558","Type":"ContainerStarted","Data":"6bb4a22da27091dd002bf6c25a283b5742ca10ad67334836f242904ab8080004"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.645632 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" event={"ID":"43ce9486-c553-4d64-92fb-20402352c29f","Type":"ContainerStarted","Data":"65da0370bc7ab2d5f76776b32337aae98c4cb7240ae6df67f53650d395b5c3bc"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.655212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" event={"ID":"39ece0d0-d290-4488-9111-f4784bebc3b2","Type":"ContainerStarted","Data":"6bfec158e4ad95ff07917b363d68b352a238c5d80541667ccf3cdf063623466f"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.663561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" 
event={"ID":"0b392476-ce74-4f7f-a12f-920531623ef6","Type":"ContainerStarted","Data":"a36a4d8c81a92bbd5d894dab21ba8839f6491a51219253307f6f4ef11c2d9a01"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.667196 4776 generic.go:334] "Generic (PLEG): container finished" podID="f22939da-a96d-4aab-8446-5452654bac1e" containerID="7dc9fe4bdc2964e3a52738bd18ef183c04276a478ba252aaf1190edd0ec1c866" exitCode=0 Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.667297 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" event={"ID":"f22939da-a96d-4aab-8446-5452654bac1e","Type":"ContainerDied","Data":"7dc9fe4bdc2964e3a52738bd18ef183c04276a478ba252aaf1190edd0ec1c866"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.674742 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5b6p" event={"ID":"a6e763c5-5d99-4374-9ade-5ac3ff4b9817","Type":"ContainerStarted","Data":"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.677528 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wjw44" event={"ID":"53ea92d3-1ca4-4663-9a90-c9cb24c6bec1","Type":"ContainerStarted","Data":"eda642454164e970565ea78e2e9fcb77d9d74fa3a8f900a8396c4bf22c66e583"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.678365 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.684789 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-wjw44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.684843 4776 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wjw44" podUID="53ea92d3-1ca4-4663-9a90-c9cb24c6bec1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.692409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" event={"ID":"29614791-cdee-451e-b670-ac7f3d34d9bb","Type":"ContainerStarted","Data":"ad339e3abc6a19747cc740ca32b982f38e44f9171c2668fe6ffc0247d6c0f017"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.693341 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4ndtq"] Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.717997 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8b4t7" event={"ID":"2677aa4a-2578-4da6-aec2-ea5e949c94f7","Type":"ContainerStarted","Data":"9d83e11feba9432ef7f76c4221cfa706a4abfbd6c12b212300ba5113e13209b0"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.718381 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.718740 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.218723568 +0000 UTC m=+45.634383728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.725656 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" event={"ID":"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3","Type":"ContainerStarted","Data":"3f91a0d3ad0ddb1b28229e122d43d16669c89939848380f38751aaa6d3517cc6"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.728713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" event={"ID":"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b","Type":"ContainerStarted","Data":"521670f1b900680a57e51c2be6942d15b588053f6d0e2c6fe35212772d88f820"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.728755 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" event={"ID":"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b","Type":"ContainerStarted","Data":"fcbf74d9e1476fd5be6f8cc66a8680eb901e71291185a73fd22143e7229e1ab0"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.729696 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.734053 4776 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-zcm9b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.734114 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.734770 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" event={"ID":"dbdea8ef-a044-48ca-bfac-19023c9fb55d","Type":"ContainerStarted","Data":"5016f1d42831029a893649eaf9cd6d831ad9375d3ba073318c6a8325f2c6cdda"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.816456 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" event={"ID":"db28956e-c117-4203-ba6e-c6eadf3908f7","Type":"ContainerStarted","Data":"eaf2bbfa704bbb0b49c64e1293fcad7483773b40a130b074000d19c51ef99751"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.819103 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.820394 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:14.320378736 +0000 UTC m=+45.736038896 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.833853 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr"] Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.844529 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcdcf" event={"ID":"071d005d-cd96-4f28-b644-982b0f846135","Type":"ContainerStarted","Data":"0ad8683ea1be0a9a4befd5c2bb7ffd675444677434867668788aad268b67c3e3"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.869618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerStarted","Data":"05eebedb8da824e67350dd2d5ac32ad451d9fb7bee97a32cfbcd015f02bbe002"} Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.893650 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g4qds"] Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.923322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:13 crc kubenswrapper[4776]: E0128 06:51:13.925142 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.42512555 +0000 UTC m=+45.840785710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.954753 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lft68"] Jan 28 06:51:13 crc kubenswrapper[4776]: I0128 06:51:13.954827 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.026700 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.027321 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:14.527289551 +0000 UTC m=+45.942949711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.048291 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.058987 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.061426 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.117869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.132661 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.132799 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" 
Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.133353 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.633331126 +0000 UTC m=+46.048991286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: W0128 06:51:14.165650 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692617f2_c85f_42ce_b008_feff57211b45.slice/crio-a6bf4866e02fdfccb08e22129465e2e2cef8cb3da3f4389d6b686b6208e166f4 WatchSource:0}: Error finding container a6bf4866e02fdfccb08e22129465e2e2cef8cb3da3f4389d6b686b6208e166f4: Status 404 returned error can't find the container with id a6bf4866e02fdfccb08e22129465e2e2cef8cb3da3f4389d6b686b6208e166f4 Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.169003 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pkbgp"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.192919 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.194947 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.207253 4776 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2znv8"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.225892 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.227995 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fz854" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.229192 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.239194 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.239305 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.73928064 +0000 UTC m=+46.154940800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.239383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.240227 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.740211462 +0000 UTC m=+46.155871622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.240676 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m58q2" podStartSLOduration=25.240659853 podStartE2EDuration="25.240659853s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.239083105 +0000 UTC m=+45.654743265" watchObservedRunningTime="2026-01-28 06:51:14.240659853 +0000 UTC m=+45.656320013" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.256291 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4"] Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.340740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.341028 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:14.840997999 +0000 UTC m=+46.256658169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.341222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.341651 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.841637445 +0000 UTC m=+46.257297595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.440561 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" podStartSLOduration=25.440530216 podStartE2EDuration="25.440530216s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.384926935 +0000 UTC m=+45.800587095" watchObservedRunningTime="2026-01-28 06:51:14.440530216 +0000 UTC m=+45.856190376" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.442495 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.442599 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.942578045 +0000 UTC m=+46.358238195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.461656 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.462118 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:14.962104353 +0000 UTC m=+46.377764513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.570387 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.570683 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.070665891 +0000 UTC m=+46.486326051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.571093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.571756 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.071738857 +0000 UTC m=+46.487399017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.672414 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.675508 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.175487446 +0000 UTC m=+46.591147616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.717980 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nltsm" podStartSLOduration=25.717964497 podStartE2EDuration="25.717964497s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.715715431 +0000 UTC m=+46.131375591" watchObservedRunningTime="2026-01-28 06:51:14.717964497 +0000 UTC m=+46.133624647" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.758097 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-p5b6p" podStartSLOduration=25.758080708 podStartE2EDuration="25.758080708s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.755519675 +0000 UTC m=+46.171179835" watchObservedRunningTime="2026-01-28 06:51:14.758080708 +0000 UTC m=+46.173740868" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.773790 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.774258 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.274240884 +0000 UTC m=+46.689901044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.809890 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" podStartSLOduration=25.809871835 podStartE2EDuration="25.809871835s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.808764269 +0000 UTC m=+46.224424429" watchObservedRunningTime="2026-01-28 06:51:14.809871835 +0000 UTC m=+46.225531995" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.874959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.875186 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.375155794 +0000 UTC m=+46.790815954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.875602 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.876018 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.376007555 +0000 UTC m=+46.791667715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.881992 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wjw44" podStartSLOduration=25.881972211 podStartE2EDuration="25.881972211s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.836354784 +0000 UTC m=+46.252014934" watchObservedRunningTime="2026-01-28 06:51:14.881972211 +0000 UTC m=+46.297632371" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.890588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" event={"ID":"56d29578-4c86-4128-adda-2fd5398645a5","Type":"ContainerStarted","Data":"5c791ff47b0f6eadea6f0c17ccfbae462cf5e7f76b3d0b1367fc416e4a933cd1"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.894350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" event={"ID":"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a","Type":"ContainerStarted","Data":"9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.896954 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.900140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" event={"ID":"f22939da-a96d-4aab-8446-5452654bac1e","Type":"ContainerStarted","Data":"d516f7bacf7d0ef3aaaaf9084c2d695a4b0a2e7c86014775615bef5da65444ca"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.900173 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.907777 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" event={"ID":"9c673c63-29c5-42eb-a59a-1350e12bffd7","Type":"ContainerStarted","Data":"0525626b4ab5afa66c1a4d2a708f47f67eb26391244858300c65a7935e5df5b3"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.919077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" event={"ID":"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6","Type":"ContainerStarted","Data":"f0cb58134975fc5f2edc93f500232b40ac0fd013603d6b900b0bea4807eb5e4c"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.929377 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" event={"ID":"d6c5e07b-212c-404f-bfa8-e96c62028a2a","Type":"ContainerStarted","Data":"60a44362c8f1d5a54e6310d69b825056b9c67df437d4a51d813f0038d2e30588"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.936989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" event={"ID":"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552","Type":"ContainerStarted","Data":"8da44423b487dc4b2479611490b19741be1e8b5e5ce4e624fa04499e30497456"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.940051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" event={"ID":"5aa8b2d7-b79e-4478-89be-bd227f7715b7","Type":"ContainerStarted","Data":"cc343c0a9d70b146f5095a105113554eaa535206552a68d08f7a43eede95ee94"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.976291 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:14 crc kubenswrapper[4776]: E0128 06:51:14.976788 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.476765101 +0000 UTC m=+46.892425261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.987114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" event={"ID":"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3","Type":"ContainerStarted","Data":"fe8475baf7b43e07a3886883cb82fab57fcf7448ea8b5d3ff9b763f4509d2046"} Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.989132 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" podStartSLOduration=25.989121173 podStartE2EDuration="25.989121173s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:14.98773507 +0000 UTC m=+46.403395220" watchObservedRunningTime="2026-01-28 06:51:14.989121173 +0000 UTC m=+46.404781333" Jan 28 06:51:14 crc kubenswrapper[4776]: I0128 06:51:14.991418 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8b4t7" event={"ID":"2677aa4a-2578-4da6-aec2-ea5e949c94f7","Type":"ContainerStarted","Data":"8aa134dc15e0121d28fa2ff682ed86c52b04d44ee9fe580dc1418b34f4a4abfd"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.009768 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" 
event={"ID":"af394275-eaa5-46bb-a956-97b40d959b18","Type":"ContainerStarted","Data":"aad392736e5c862ff714cdbef7a002e3ccea6541d91f4043eaf5270b553a241d"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.016583 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d5hf2" event={"ID":"3819a037-a2a1-433f-884e-84bead904558","Type":"ContainerStarted","Data":"d3a1d9b743321da4c1e78c7d19532076427b187a82feb6e99b1b57fda67241ec"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.027482 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podStartSLOduration=6.027460992 podStartE2EDuration="6.027460992s" podCreationTimestamp="2026-01-28 06:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.024613183 +0000 UTC m=+46.440273353" watchObservedRunningTime="2026-01-28 06:51:15.027460992 +0000 UTC m=+46.443121152" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.043416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" event={"ID":"692617f2-c85f-42ce-b008-feff57211b45","Type":"ContainerStarted","Data":"a6bf4866e02fdfccb08e22129465e2e2cef8cb3da3f4389d6b686b6208e166f4"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.048269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerStarted","Data":"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.061407 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.074753 4776 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2vn7f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.074828 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.075084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" event={"ID":"8d8916b5-df67-4492-8c28-8f0d872a4997","Type":"ContainerStarted","Data":"37d028f5733dab413721bba738150f55a536046ea7636dbc3a166693fe1ecd46"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.078824 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.086100 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.586081867 +0000 UTC m=+47.001742027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.105568 4776 generic.go:334] "Generic (PLEG): container finished" podID="db28956e-c117-4203-ba6e-c6eadf3908f7" containerID="6033cd2353e7b6a021be841b24a9621078e8670457d97753922998a68946f097" exitCode=0 Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.105667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" event={"ID":"db28956e-c117-4203-ba6e-c6eadf3908f7","Type":"ContainerDied","Data":"6033cd2353e7b6a021be841b24a9621078e8670457d97753922998a68946f097"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.126451 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8b4t7" podStartSLOduration=7.126431034 podStartE2EDuration="7.126431034s" podCreationTimestamp="2026-01-28 06:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.12626365 +0000 UTC m=+46.541923810" watchObservedRunningTime="2026-01-28 06:51:15.126431034 +0000 UTC m=+46.542091194" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.132409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" event={"ID":"784215d5-15f7-4ff3-b0b5-f176cc7b14b2","Type":"ContainerStarted","Data":"cdfd9b87a3a8ef32183db88d792a1c130b5e9456aaec004ffebd8ff4ddbc0595"} Jan 28 06:51:15 crc 
kubenswrapper[4776]: I0128 06:51:15.132461 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" event={"ID":"784215d5-15f7-4ff3-b0b5-f176cc7b14b2","Type":"ContainerStarted","Data":"d6a88a19616096f3f68e3d54274c7f09f1d080f23d360e51b3bdb7460990d66b"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.135988 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkbgp" event={"ID":"2d298bb7-36e2-4d16-97f5-0ee37018f44a","Type":"ContainerStarted","Data":"af65bfc7eb7d170ccb96f4ced61048f1b4c65c4211fa79c69db9812a2e60aae9"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.151453 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d5hf2" podStartSLOduration=26.151436967 podStartE2EDuration="26.151436967s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.149579142 +0000 UTC m=+46.565239312" watchObservedRunningTime="2026-01-28 06:51:15.151436967 +0000 UTC m=+46.567097127" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.154073 4776 generic.go:334] "Generic (PLEG): container finished" podID="39ece0d0-d290-4488-9111-f4784bebc3b2" containerID="9bd4f797b5f78c50ebb3dd53b94fe07524dc3439fb55ebf93a555fa95b400355" exitCode=0 Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.155145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" event={"ID":"39ece0d0-d290-4488-9111-f4784bebc3b2","Type":"ContainerDied","Data":"9bd4f797b5f78c50ebb3dd53b94fe07524dc3439fb55ebf93a555fa95b400355"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.165116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" event={"ID":"b125288e-95aa-474b-9c87-17f11147206f","Type":"ContainerStarted","Data":"a5f88106c6fa45c34d9362bf896e1961ade65ccbe52168c8fdb1b160b83440b9"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.180290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" event={"ID":"0b392476-ce74-4f7f-a12f-920531623ef6","Type":"ContainerStarted","Data":"52ccc77077806b2553e8db446635708d91e39bdaa34ec0a5e1fe73cf98d35e80"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.186101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" event={"ID":"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b","Type":"ContainerStarted","Data":"2d9d8fbf83eb35f6814fc81bac7336164c2434f3e89085258c39f6ee35a8fb5b"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.202738 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.204145 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.704127937 +0000 UTC m=+47.119788097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.206078 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" event={"ID":"a8386b67-8be2-4d18-9358-fccd65c363db","Type":"ContainerStarted","Data":"c1b5c26389ae3d71e6551941fd6efc2366ef029e9ecf66e479a91f5f2dddc886"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.218676 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" event={"ID":"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7","Type":"ContainerStarted","Data":"547f9a8fbba412c7c8e2ebc4775375b6cef54a5cee31fe3fe2b9f7b080c3a179"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.222663 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" event={"ID":"9ac969b9-e7de-444a-902f-c2117d50769d","Type":"ContainerStarted","Data":"880b56b970584baf6eb612aa9b8b8377a68c3a68d947931502039feab62ebc8e"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.222707 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" event={"ID":"9ac969b9-e7de-444a-902f-c2117d50769d","Type":"ContainerStarted","Data":"8956372d7d62453f501c01bbbf3fe0a72be3481c0c52a65090e4e71c343e1e39"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.226874 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podStartSLOduration=26.226856643 podStartE2EDuration="26.226856643s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.224933876 +0000 UTC m=+46.640594036" watchObservedRunningTime="2026-01-28 06:51:15.226856643 +0000 UTC m=+46.642516803" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.227523 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vxslm" podStartSLOduration=26.2275132 podStartE2EDuration="26.2275132s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.202204069 +0000 UTC m=+46.617864229" watchObservedRunningTime="2026-01-28 06:51:15.2275132 +0000 UTC m=+46.643173370" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.238353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2znv8" event={"ID":"fba465ce-f898-4cca-b8f1-6281aef02eb7","Type":"ContainerStarted","Data":"7b7fffb4fb4af852f72a7595aa627091ae803bb47197159671e9e6a37fb8a70f"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.243861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" event={"ID":"d26e3039-7460-49d9-8f89-637d57601639","Type":"ContainerStarted","Data":"1e04d56e0d7e2bdca61c135e3727254121016b8a10520b3695108d59ba604ad6"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.245874 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" podStartSLOduration=26.245858358 podStartE2EDuration="26.245858358s" 
podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.244213788 +0000 UTC m=+46.659873948" watchObservedRunningTime="2026-01-28 06:51:15.245858358 +0000 UTC m=+46.661518518" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.281425 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" event={"ID":"dbdea8ef-a044-48ca-bfac-19023c9fb55d","Type":"ContainerStarted","Data":"fe09a2dad029f2c7bb967221cdb6415927a2f7cf27c9d16001cfb8e9d1a00584"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.294276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcdcf" event={"ID":"071d005d-cd96-4f28-b644-982b0f846135","Type":"ContainerStarted","Data":"d9c47443b875f74c61f07dd3c74ecbcb313aef4d996ba28c6712478565dd4ca6"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.304474 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.305934 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.805920048 +0000 UTC m=+47.221580208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.322882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" event={"ID":"1f7c9ee3-b72d-4af7-998f-cad0df531c31","Type":"ContainerStarted","Data":"ca426d3cc3460bf5f24483c9591fc4a17e1641bb1b9348826544a9783afbe242"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.335837 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" event={"ID":"1d563f89-2d21-46cb-a830-2d5b7403f7a1","Type":"ContainerStarted","Data":"6a33262cab4625c384e63d484df845ebb85d599c9d5118ad9da9500cdfbf27e1"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.336826 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.340623 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" event={"ID":"43ce9486-c553-4d64-92fb-20402352c29f","Type":"ContainerStarted","Data":"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5"} Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.340659 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.343950 4776 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-wjw44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.343988 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wjw44" podUID="53ea92d3-1ca4-4663-9a90-c9cb24c6bec1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.351630 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.361645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.362128 4776 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vr8bd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" start-of-body= Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.362175 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" podUID="1d563f89-2d21-46cb-a830-2d5b7403f7a1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.379752 4776 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wjm9m container/oauth-openshift namespace/openshift-authentication: Readiness probe 
status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.379813 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" podUID="43ce9486-c553-4d64-92fb-20402352c29f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.404309 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-g4qds" podStartSLOduration=26.404289427 podStartE2EDuration="26.404289427s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.344584255 +0000 UTC m=+46.760244415" watchObservedRunningTime="2026-01-28 06:51:15.404289427 +0000 UTC m=+46.819949587" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.405636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.405863 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.905846724 +0000 UTC m=+47.321506884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.406204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.413088 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:15.913070651 +0000 UTC m=+47.328730811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.482440 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5p2jq" podStartSLOduration=27.482417148 podStartE2EDuration="27.482417148s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.457370326 +0000 UTC m=+46.873030486" watchObservedRunningTime="2026-01-28 06:51:15.482417148 +0000 UTC m=+46.898077308" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.507247 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.528914 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.028882936 +0000 UTC m=+47.444543096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.553161 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" podStartSLOduration=26.55313818 podStartE2EDuration="26.55313818s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.532858643 +0000 UTC m=+46.948518803" watchObservedRunningTime="2026-01-28 06:51:15.55313818 +0000 UTC m=+46.968798340" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.577523 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" podStartSLOduration=26.577494855 podStartE2EDuration="26.577494855s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.568417294 +0000 UTC m=+46.984077454" watchObservedRunningTime="2026-01-28 06:51:15.577494855 +0000 UTC m=+46.993155025" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.609075 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.621749 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:15 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:15 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:15 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.621809 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.634983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.635254 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.135243169 +0000 UTC m=+47.550903329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.640727 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" podStartSLOduration=27.640698343 podStartE2EDuration="27.640698343s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.606351313 +0000 UTC m=+47.022011473" watchObservedRunningTime="2026-01-28 06:51:15.640698343 +0000 UTC m=+47.056358503" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.641024 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tcdcf" podStartSLOduration=26.641020641 podStartE2EDuration="26.641020641s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.633858866 +0000 UTC m=+47.049519026" watchObservedRunningTime="2026-01-28 06:51:15.641020641 +0000 UTC m=+47.056680801" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.730818 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" podStartSLOduration=26.730801998 podStartE2EDuration="26.730801998s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:15.730637484 +0000 UTC m=+47.146297644" watchObservedRunningTime="2026-01-28 06:51:15.730801998 +0000 UTC m=+47.146462158" Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.737177 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.737534 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.237518433 +0000 UTC m=+47.653178583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.843397 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.843996 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.343979479 +0000 UTC m=+47.759639639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.944200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.944648 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.444515389 +0000 UTC m=+47.860175549 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:15 crc kubenswrapper[4776]: I0128 06:51:15.948232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:15 crc kubenswrapper[4776]: E0128 06:51:15.948710 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.448693552 +0000 UTC m=+47.864353712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.048737 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.048997 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.548947666 +0000 UTC m=+47.964607826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.050369 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.050854 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.550831422 +0000 UTC m=+47.966491582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.151776 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.152307 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.652275406 +0000 UTC m=+48.067935566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.152483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.152959 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.652943212 +0000 UTC m=+48.068603372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.253763 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.253915 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.753891873 +0000 UTC m=+48.169552033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.254671 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.255006 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.754998069 +0000 UTC m=+48.170658229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.351821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" event={"ID":"a8386b67-8be2-4d18-9358-fccd65c363db","Type":"ContainerStarted","Data":"8ac3a347b1d61e0a3b18d7e43bd2e8c3dcc109cb2e87b7ff879efd44b6a763dd"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.355402 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.355731 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.855677964 +0000 UTC m=+48.271338124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.358972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" event={"ID":"b125288e-95aa-474b-9c87-17f11147206f","Type":"ContainerStarted","Data":"3a5d457601dde14e144806efee18c695fe99937f5e4cff6dd82d6290caffd501"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.368083 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" event={"ID":"9c673c63-29c5-42eb-a59a-1350e12bffd7","Type":"ContainerStarted","Data":"b4381c32a9b061afabd1ba93aa976346ba19e30dfda37329b05b846df610cac1"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.383234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" event={"ID":"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7","Type":"ContainerStarted","Data":"2f091993168b15871193f34dfb67d4f0a27a179e6204f5d9bd2af1d12fd5080f"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.383284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" event={"ID":"90d1a64c-a8ef-4af9-a3ce-fa6357b570d7","Type":"ContainerStarted","Data":"c570488e30a1cfcd1a4e2ba7832d1d795117703da7a362d136a59f9c842089a9"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.383359 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.391259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" event={"ID":"db28956e-c117-4203-ba6e-c6eadf3908f7","Type":"ContainerStarted","Data":"aac0ef357e5c969ee925309e6da8f4217a054e05c83a8a69b2121b67fd42c760"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.393234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fnvp9" event={"ID":"1f7c9ee3-b72d-4af7-998f-cad0df531c31","Type":"ContainerStarted","Data":"3baee6130a98799740f329ae5719ecb84bc7cb5bcee815dccec2fc5840c7a0d4"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.403137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" event={"ID":"5aa8b2d7-b79e-4478-89be-bd227f7715b7","Type":"ContainerStarted","Data":"fc6d0d7bf6bba5d7b7f9f20e6432d52b48b8ee74f45033e92c8d415ab0476193"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.405642 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5bxjg" podStartSLOduration=27.405628807 podStartE2EDuration="27.405628807s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.376453763 +0000 UTC m=+47.792113923" watchObservedRunningTime="2026-01-28 06:51:16.405628807 +0000 UTC m=+47.821288957" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.412353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" 
event={"ID":"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b","Type":"ContainerStarted","Data":"3e3a40a31d4ebf1cbd00762345f03b760e54a098674e9e66e13d461b7b4f51d4"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.412396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" event={"ID":"55effd60-d9e1-4104-ac9a-2ed1d9c7e31b","Type":"ContainerStarted","Data":"abbef80c4b36b9b95c09c2df1ec22fb6049a664c057c7ae2796a23d5c42a4ace"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.414665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" event={"ID":"af394275-eaa5-46bb-a956-97b40d959b18","Type":"ContainerStarted","Data":"82a2f31dbb94dde6b9ccd93fa27dbf1518d289d7943196b4f5d9826ebfc15ad3"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.414693 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" event={"ID":"af394275-eaa5-46bb-a956-97b40d959b18","Type":"ContainerStarted","Data":"17ecd36bf86cc6a84b2eed5bb445ff62bb0f0d21cf8271c325e2a57380238c94"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.419180 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2pxtc" podStartSLOduration=27.419168639 podStartE2EDuration="27.419168639s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.415475508 +0000 UTC m=+47.831135668" watchObservedRunningTime="2026-01-28 06:51:16.419168639 +0000 UTC m=+47.834828799" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.437922 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkbgp" 
event={"ID":"2d298bb7-36e2-4d16-97f5-0ee37018f44a","Type":"ContainerStarted","Data":"5397c2aa81688191e97f268cc297e5e897f379ae374669942e172ff4b4a48c3d"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.437972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkbgp" event={"ID":"2d298bb7-36e2-4d16-97f5-0ee37018f44a","Type":"ContainerStarted","Data":"984dc649ee414c022a70f93526ec2148897ec9d11a9de1885c0209d225286d36"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.438663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.443057 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" event={"ID":"8d8916b5-df67-4492-8c28-8f0d872a4997","Type":"ContainerStarted","Data":"35bdc114e6ab139e02ddcd6f7f6e8cfe292d0efde9c68cca4cf03f96e8ddca40"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.443094 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.447652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" event={"ID":"29a5bcd2-ab98-42f3-b1f8-0aca08cc1552","Type":"ContainerStarted","Data":"cb4bdffd28ef4ca7e98282672db30bccd8dc9f20d54565bfc93c730b20e23d15"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.448072 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.457156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.458583 4776 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4djdr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.459307 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:16.95928664 +0000 UTC m=+48.374946920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.458682 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" podUID="8d8916b5-df67-4492-8c28-8f0d872a4997" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.465296 4776 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hs6h4 container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.465366 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" podUID="29a5bcd2-ab98-42f3-b1f8-0aca08cc1552" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.473132 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2znv8" event={"ID":"fba465ce-f898-4cca-b8f1-6281aef02eb7","Type":"ContainerStarted","Data":"7459cbaef0a5155efd4e369993c7e5841c244030d5d84baff77f475b61904ab9"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.484286 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r48r6" podStartSLOduration=27.484253641 podStartE2EDuration="27.484253641s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.482188531 +0000 UTC m=+47.897848691" watchObservedRunningTime="2026-01-28 06:51:16.484253641 +0000 UTC m=+47.899913801" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.491710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" event={"ID":"2e9f99f8-da42-4c58-a58f-9ebd2a12fca3","Type":"ContainerStarted","Data":"dafb12fdf0f3553764f6fd60293a3038e846bdac4caa14a248e309bee06d7217"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.514820 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xq6gp" event={"ID":"dbdea8ef-a044-48ca-bfac-19023c9fb55d","Type":"ContainerStarted","Data":"b4f2d9d8fe24fe1d347a5f5b862bd48e18fff86309f256acedf1ad0eff4bd370"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.524976 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" event={"ID":"39ece0d0-d290-4488-9111-f4784bebc3b2","Type":"ContainerStarted","Data":"9046276ce893c065f843b511b1ba29e9bb7f44e34b27e1d2abd7321cfeaeea80"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.532408 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2znv8" podStartSLOduration=7.532381989 podStartE2EDuration="7.532381989s" podCreationTimestamp="2026-01-28 06:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.526055795 +0000 UTC m=+47.941715955" watchObservedRunningTime="2026-01-28 06:51:16.532381989 +0000 UTC m=+47.948042149" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.533319 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" event={"ID":"d26e3039-7460-49d9-8f89-637d57601639","Type":"ContainerStarted","Data":"2878c8be6432c320caca2a46e57a2bc1c5a2aaf4bc41520bb0579c7778a6acf7"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.544866 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" event={"ID":"56d29578-4c86-4128-adda-2fd5398645a5","Type":"ContainerStarted","Data":"2633eb71856cc9a588d66dd49ce53cd4feb80d1fb9e9049b8d8c9a46e1bf2530"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.558484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.560731 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.060699162 +0000 UTC m=+48.476359322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.563978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" event={"ID":"29614791-cdee-451e-b670-ac7f3d34d9bb","Type":"ContainerStarted","Data":"6aa68ef5d6a2836291bb1a143362cd16eb3aebd1225689a9240b252c59e46cff"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.611501 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" event={"ID":"692617f2-c85f-42ce-b008-feff57211b45","Type":"ContainerStarted","Data":"11b7337dddb9c705c9697b2eb0b66d0fe60d6720984b43eedc0ad1887617a97e"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.611576 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" 
event={"ID":"692617f2-c85f-42ce-b008-feff57211b45","Type":"ContainerStarted","Data":"5d92cab1930708097d78aa0b182b33e2b9737e788c0dde48f702dcb36617ff1b"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.630677 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pkbgp" podStartSLOduration=7.630648344 podStartE2EDuration="7.630648344s" podCreationTimestamp="2026-01-28 06:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.562892696 +0000 UTC m=+47.978552866" watchObservedRunningTime="2026-01-28 06:51:16.630648344 +0000 UTC m=+48.046308504" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.632470 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:16 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:16 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:16 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.632528 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.635234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" event={"ID":"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6","Type":"ContainerStarted","Data":"7e5bb4523f7146884846555477d1395417f045e22abb8c878130afdffa377236"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.635286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" event={"ID":"ea1a3f88-1cd5-44af-8f5e-5713cd39b4d6","Type":"ContainerStarted","Data":"fae4945b3ab0ae4d9ec3ec3b1b7ba77fa7abbd6252e5db26c79b0dd1ad7995a1"} Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.637450 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-wjw44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.637520 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wjw44" podUID="53ea92d3-1ca4-4663-9a90-c9cb24c6bec1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.638240 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2vn7f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.638265 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.658023 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.661453 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.669162 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.169112216 +0000 UTC m=+48.584772376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.678414 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lft68" podStartSLOduration=27.678387013 podStartE2EDuration="27.678387013s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.673642217 +0000 UTC m=+48.089302377" watchObservedRunningTime="2026-01-28 06:51:16.678387013 +0000 UTC m=+48.094047173" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.679879 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dr6wg" podStartSLOduration=27.679872070000002 
podStartE2EDuration="27.67987207s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.629680411 +0000 UTC m=+48.045340601" watchObservedRunningTime="2026-01-28 06:51:16.67987207 +0000 UTC m=+48.095532230" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.715369 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.724459 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" podStartSLOduration=27.7244301 podStartE2EDuration="27.7244301s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.723837826 +0000 UTC m=+48.139497986" watchObservedRunningTime="2026-01-28 06:51:16.7244301 +0000 UTC m=+48.140090260" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.762888 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.763236 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.26321279 +0000 UTC m=+48.678872950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.794339 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" podStartSLOduration=27.794317311 podStartE2EDuration="27.794317311s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.758876783 +0000 UTC m=+48.174536943" watchObservedRunningTime="2026-01-28 06:51:16.794317311 +0000 UTC m=+48.209977471" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.796050 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56zbt" podStartSLOduration=27.796036772 podStartE2EDuration="27.796036772s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.793572363 +0000 UTC m=+48.209232523" watchObservedRunningTime="2026-01-28 06:51:16.796036772 +0000 UTC m=+48.211696932" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.826976 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" podStartSLOduration=27.826960219 podStartE2EDuration="27.826960219s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.825574836 +0000 UTC m=+48.241235006" watchObservedRunningTime="2026-01-28 06:51:16.826960219 +0000 UTC m=+48.242620379" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.864138 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.864492 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.364480798 +0000 UTC m=+48.780140958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.882317 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" podStartSLOduration=27.882295324 podStartE2EDuration="27.882295324s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.876796639 +0000 UTC m=+48.292456799" watchObservedRunningTime="2026-01-28 06:51:16.882295324 +0000 UTC m=+48.297955494" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.901305 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-h4g2z" podStartSLOduration=27.901285159 podStartE2EDuration="27.901285159s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.897760213 +0000 UTC m=+48.313420363" watchObservedRunningTime="2026-01-28 06:51:16.901285159 +0000 UTC m=+48.316945319" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.904086 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.904131 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.905909 4776 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2qxtd container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.905958 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" podUID="db28956e-c117-4203-ba6e-c6eadf3908f7" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.938851 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.939200 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.964827 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:16 crc kubenswrapper[4776]: E0128 06:51:16.965395 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.465366158 +0000 UTC m=+48.881026318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.971011 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.997689 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qqqv5" podStartSLOduration=28.997671828 podStartE2EDuration="28.997671828s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.931569541 +0000 UTC m=+48.347229711" watchObservedRunningTime="2026-01-28 06:51:16.997671828 +0000 UTC m=+48.413331988" Jan 28 06:51:16 crc kubenswrapper[4776]: I0128 06:51:16.999540 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hxpsz" podStartSLOduration=27.999531494 podStartE2EDuration="27.999531494s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:16.997410962 +0000 UTC m=+48.413071122" watchObservedRunningTime="2026-01-28 06:51:16.999531494 +0000 UTC m=+48.415191644" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.041480 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vr8bd" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.066626 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.067229 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.567211711 +0000 UTC m=+48.982871871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.077150 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv9ms" podStartSLOduration=29.077129174 podStartE2EDuration="29.077129174s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:17.064089235 +0000 UTC m=+48.479749395" watchObservedRunningTime="2026-01-28 06:51:17.077129174 +0000 UTC m=+48.492789334" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 
06:51:17.169664 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.169998 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.669967996 +0000 UTC m=+49.085628156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.170087 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.170655 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:17.670623182 +0000 UTC m=+49.086283342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.269118 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k9524" podStartSLOduration=28.269103303 podStartE2EDuration="28.269103303s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:17.201784615 +0000 UTC m=+48.617444775" watchObservedRunningTime="2026-01-28 06:51:17.269103303 +0000 UTC m=+48.684763463" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.271600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.271850 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.77183755 +0000 UTC m=+49.187497710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.287268 4776 csr.go:261] certificate signing request csr-5gjvw is approved, waiting to be issued Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.299013 4776 csr.go:257] certificate signing request csr-5gjvw is issued Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.317597 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xlbwv"] Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.374900 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.375640 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.87562158 +0000 UTC m=+49.291281740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.475503 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.476076 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:17.976059159 +0000 UTC m=+49.391719319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.533759 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.576635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.576924 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.076912898 +0000 UTC m=+49.492573058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.615452 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:17 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:17 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:17 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.615840 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.653900 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" event={"ID":"d26e3039-7460-49d9-8f89-637d57601639","Type":"ContainerStarted","Data":"f1258855e8955af3702b4ed86341523b5ce5eac5dc169ebe2d61299aaf15cf89"} Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.670333 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" event={"ID":"39ece0d0-d290-4488-9111-f4784bebc3b2","Type":"ContainerStarted","Data":"794d3916ed93d8e2c428ff4f5b19eaa2259e89669c5d9f814af561e99963a852"} Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.674294 4776 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2vn7f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.674353 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.683012 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.683880 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.183847095 +0000 UTC m=+49.599507325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.688205 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hs6h4" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.780520 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" podStartSLOduration=29.780503941 podStartE2EDuration="29.780503941s" podCreationTimestamp="2026-01-28 06:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:17.777009276 +0000 UTC m=+49.192669436" watchObservedRunningTime="2026-01-28 06:51:17.780503941 +0000 UTC m=+49.196164101" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.799105 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.820853 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 06:51:18.320720675 +0000 UTC m=+49.736380835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.862982 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4djdr" Jan 28 06:51:17 crc kubenswrapper[4776]: I0128 06:51:17.902867 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:17 crc kubenswrapper[4776]: E0128 06:51:17.903187 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.403169904 +0000 UTC m=+49.818830064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.004850 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.005288 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.505272283 +0000 UTC m=+49.920932443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.105987 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.106192 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.606157773 +0000 UTC m=+50.021817933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.106269 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.106617 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.606607543 +0000 UTC m=+50.022267703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.181644 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.182665 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.185275 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.185296 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.197633 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g79fj" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.207298 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.207510 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.707476152 +0000 UTC m=+50.123136312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.207709 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.207763 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.207989 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.208293 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.708284383 +0000 UTC m=+50.123944543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.214900 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.300373 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-28 06:46:17 +0000 UTC, rotation deadline is 2026-10-24 09:01:34.82781257 +0000 UTC Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.300408 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6458h10m16.527407243s for next certificate rotation Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.309476 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.309672 4776 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.809654394 +0000 UTC m=+50.225314554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.310349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.310671 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.810663388 +0000 UTC m=+50.226323548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.310923 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.310954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.311136 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.352386 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.412852 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.413288 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:18.91327031 +0000 UTC m=+50.328930470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.496668 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.517295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.517658 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.017639695 +0000 UTC m=+50.433299855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.613712 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:18 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:18 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:18 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.613956 4776 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.618826 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.619183 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.11916921 +0000 UTC m=+50.534829360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.685729 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" event={"ID":"d26e3039-7460-49d9-8f89-637d57601639","Type":"ContainerStarted","Data":"13f0f2e7f33be11aad7fcafb647d00c4bd9d27baef0439e0d84a7aea7540e361"} Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.688859 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" 
containerName="kube-multus-additional-cni-plugins" containerID="cri-o://9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" gracePeriod=30 Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.721194 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.724100 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.224083508 +0000 UTC m=+50.639743668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.822973 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.823496 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.323475841 +0000 UTC m=+50.739136001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.836541 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.838865 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.842121 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.851701 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.874249 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.927184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.927229 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.927278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6nt\" (UniqueName: \"kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:18 crc kubenswrapper[4776]: I0128 06:51:18.927317 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:18 crc kubenswrapper[4776]: E0128 06:51:18.927712 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.427692661 +0000 UTC m=+50.843352821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.023914 4776 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.028298 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.028520 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.528487859 +0000 UTC m=+50.944148019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.028636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6nt\" (UniqueName: \"kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.028704 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.029053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.029098 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.029275 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.529243168 +0000 UTC m=+50.944903328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.029595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.029619 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.041443 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.042849 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.050741 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.051704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6nt\" (UniqueName: \"kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt\") pod \"certified-operators-2f2sb\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.053614 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.131167 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.135726 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.635668343 +0000 UTC m=+51.051328513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.145013 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxb24\" (UniqueName: \"kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.145140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.145265 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.145397 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.146664 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.646637711 +0000 UTC m=+51.062297861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.168261 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.234107 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.237937 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.246855 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.247017 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.746991257 +0000 UTC m=+51.162651417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247134 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247194 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxb24\" (UniqueName: 
\"kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.247489 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.747482269 +0000 UTC m=+51.163142429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247716 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.247841 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.259292 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.278159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxb24\" (UniqueName: \"kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24\") pod \"community-operators-jvjv2\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.353569 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.353880 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.353946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnwv\" (UniqueName: \"kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.354011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.354239 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.854210282 +0000 UTC m=+51.269870442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.371295 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.438691 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.445911 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.449416 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.455288 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.455361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnwv\" (UniqueName: \"kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.455404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.455431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.455846 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content\") pod 
\"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.456189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.456598 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 06:51:19.956580077 +0000 UTC m=+51.372240237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vjkh5" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.462813 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.463381 4776 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-28T06:51:19.023948598Z","Handler":null,"Name":""} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.466875 4776 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: 
/var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.466925 4776 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.473577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnwv\" (UniqueName: \"kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv\") pod \"certified-operators-5cqq4\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.552464 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.557147 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.557458 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.558703 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities\") pod \"community-operators-24xrp\" 
(UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.558821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6wp\" (UniqueName: \"kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.564706 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.622104 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:19 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:19 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:19 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.622159 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.663902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.664056 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.664105 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6wp\" (UniqueName: \"kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.664180 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.664761 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.665697 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.669507 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.669581 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: E0128 06:51:19.694645 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784215d5_15f7_4ff3_b0b5_f176cc7b14b2.slice/crio-cdfd9b87a3a8ef32183db88d792a1c130b5e9456aaec004ffebd8ff4ddbc0595.scope\": RecentStats: unable to find data in memory cache]" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.695103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v6wp\" (UniqueName: \"kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp\") pod \"community-operators-24xrp\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.700328 4776 
generic.go:334] "Generic (PLEG): container finished" podID="784215d5-15f7-4ff3-b0b5-f176cc7b14b2" containerID="cdfd9b87a3a8ef32183db88d792a1c130b5e9456aaec004ffebd8ff4ddbc0595" exitCode=0 Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.700826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" event={"ID":"784215d5-15f7-4ff3-b0b5-f176cc7b14b2","Type":"ContainerDied","Data":"cdfd9b87a3a8ef32183db88d792a1c130b5e9456aaec004ffebd8ff4ddbc0595"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.706614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9","Type":"ContainerStarted","Data":"83dcc55edcbe7058a366107b1193c1a6cebef0030df5988b73a75b5dc2e4c52d"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.706674 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9","Type":"ContainerStarted","Data":"1e0df2ef2b195a5316edb0f56bdc7aff190165b1873d806d7875579eb1adc106"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.711538 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" event={"ID":"d26e3039-7460-49d9-8f89-637d57601639","Type":"ContainerStarted","Data":"02d66142152b9387aabbd43a49089adc19268e8f822c464ab96fdf1e09482af0"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.712654 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerStarted","Data":"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.712713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerStarted","Data":"275362d2724ba94350de6664cae52731b07115cee69466d8e23e3fdf97c0e84e"} Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.717927 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vjkh5\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.727024 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.753665 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.768158 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.781792 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4ndtq" podStartSLOduration=10.781777267 podStartE2EDuration="10.781777267s" podCreationTimestamp="2026-01-28 06:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:19.77573652 +0000 UTC m=+51.191396680" watchObservedRunningTime="2026-01-28 06:51:19.781777267 +0000 UTC m=+51.197437427" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.809180 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.809160858 podStartE2EDuration="1.809160858s" podCreationTimestamp="2026-01-28 06:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:19.807778445 +0000 UTC m=+51.223438605" watchObservedRunningTime="2026-01-28 06:51:19.809160858 +0000 UTC m=+51.224821018" Jan 28 06:51:19 crc kubenswrapper[4776]: I0128 06:51:19.867411 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.038033 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.073785 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:51:20 crc kubenswrapper[4776]: W0128 06:51:20.103846 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b5f9b9_549e_443e_9fc5_eb377698f57b.slice/crio-b1f14d6c56a8063e11ec0edf990bbc3f96aee7d3f5f3475483d7186738ae1e77 WatchSource:0}: Error finding container b1f14d6c56a8063e11ec0edf990bbc3f96aee7d3f5f3475483d7186738ae1e77: Status 404 returned error can't find the container with id b1f14d6c56a8063e11ec0edf990bbc3f96aee7d3f5f3475483d7186738ae1e77 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.576167 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.576630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.577720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.582225 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.613519 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:20 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:20 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:20 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.613634 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.677960 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.678086 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.682315 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.683582 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.727738 4776 generic.go:334] "Generic (PLEG): container finished" podID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerID="8a9a2537aaeeb6cdc478165556ecefb51a338a5bb5ee97f01c40223dcbb6edbd" exitCode=0 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.729715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerDied","Data":"8a9a2537aaeeb6cdc478165556ecefb51a338a5bb5ee97f01c40223dcbb6edbd"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.729791 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerStarted","Data":"647033e02ce16eae4f3fa2ec8cbb214b41bcfa410bf2c853d5b71b9bfae07982"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.732338 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.736406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" 
event={"ID":"57b5f9b9-549e-443e-9fc5-eb377698f57b","Type":"ContainerStarted","Data":"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.737677 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" event={"ID":"57b5f9b9-549e-443e-9fc5-eb377698f57b","Type":"ContainerStarted","Data":"b1f14d6c56a8063e11ec0edf990bbc3f96aee7d3f5f3475483d7186738ae1e77"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.737723 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.739316 4776 generic.go:334] "Generic (PLEG): container finished" podID="c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" containerID="83dcc55edcbe7058a366107b1193c1a6cebef0030df5988b73a75b5dc2e4c52d" exitCode=0 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.739480 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9","Type":"ContainerDied","Data":"83dcc55edcbe7058a366107b1193c1a6cebef0030df5988b73a75b5dc2e4c52d"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.749277 4776 generic.go:334] "Generic (PLEG): container finished" podID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerID="20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4" exitCode=0 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.749404 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerDied","Data":"20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.749467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerStarted","Data":"f2678e37ab6826fe251f57f31516186d86ce434a851bb0255bfd86550e41c53e"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.754403 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerID="801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542" exitCode=0 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.754462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerDied","Data":"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.758264 4776 generic.go:334] "Generic (PLEG): container finished" podID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerID="b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0" exitCode=0 Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.758357 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerDied","Data":"b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.758389 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerStarted","Data":"89f939bc81de658ffc0709ec512f212344c350aa2b022d81a3abf4b037b4e9c3"} Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.818854 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.831647 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.886985 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" podStartSLOduration=31.88696312 podStartE2EDuration="31.88696312s" podCreationTimestamp="2026-01-28 06:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:20.818850302 +0000 UTC m=+52.234510462" watchObservedRunningTime="2026-01-28 06:51:20.88696312 +0000 UTC m=+52.302623280" Jan 28 06:51:20 crc kubenswrapper[4776]: I0128 06:51:20.954241 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.037846 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.039385 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.043888 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.053047 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.083700 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvrs\" (UniqueName: \"kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.083773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.083826 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.094867 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.184315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume\") pod \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.184380 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr7q7\" (UniqueName: \"kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7\") pod \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.184509 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume\") pod \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\" (UID: \"784215d5-15f7-4ff3-b0b5-f176cc7b14b2\") " Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.184687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvrs\" (UniqueName: \"kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.184732 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc 
kubenswrapper[4776]: I0128 06:51:21.184772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.185271 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "784215d5-15f7-4ff3-b0b5-f176cc7b14b2" (UID: "784215d5-15f7-4ff3-b0b5-f176cc7b14b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.185826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.185881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.191210 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7" (OuterVolumeSpecName: "kube-api-access-fr7q7") pod "784215d5-15f7-4ff3-b0b5-f176cc7b14b2" (UID: "784215d5-15f7-4ff3-b0b5-f176cc7b14b2"). InnerVolumeSpecName "kube-api-access-fr7q7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.191636 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "784215d5-15f7-4ff3-b0b5-f176cc7b14b2" (UID: "784215d5-15f7-4ff3-b0b5-f176cc7b14b2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.201433 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvrs\" (UniqueName: \"kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs\") pod \"redhat-marketplace-ndmtj\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.285616 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.285654 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.285667 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr7q7\" (UniqueName: \"kubernetes.io/projected/784215d5-15f7-4ff3-b0b5-f176cc7b14b2-kube-api-access-fr7q7\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.311461 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 
06:51:21.366532 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:51:21 crc kubenswrapper[4776]: W0128 06:51:21.384119 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-1658182100b408fd6a3d178338dc90413513869e4a7f4f1337589f0539f65776 WatchSource:0}: Error finding container 1658182100b408fd6a3d178338dc90413513869e4a7f4f1337589f0539f65776: Status 404 returned error can't find the container with id 1658182100b408fd6a3d178338dc90413513869e4a7f4f1337589f0539f65776 Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.436614 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:51:21 crc kubenswrapper[4776]: E0128 06:51:21.437183 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784215d5-15f7-4ff3-b0b5-f176cc7b14b2" containerName="collect-profiles" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.437199 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="784215d5-15f7-4ff3-b0b5-f176cc7b14b2" containerName="collect-profiles" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.437291 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="784215d5-15f7-4ff3-b0b5-f176cc7b14b2" containerName="collect-profiles" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.438027 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.461402 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.487429 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675ck\" (UniqueName: \"kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.487530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.487570 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.589796 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675ck\" (UniqueName: \"kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.589938 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.589958 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.590954 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.591014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.605422 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.613352 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:21 crc kubenswrapper[4776]: [-]has-synced failed: 
reason withheld Jan 28 06:51:21 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:21 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.613745 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.622061 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675ck\" (UniqueName: \"kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck\") pod \"redhat-marketplace-482wz\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.760985 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-wjw44 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.761036 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wjw44" podUID="53ea92d3-1ca4-4663-9a90-c9cb24c6bec1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.761324 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.762091 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-wjw44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.762113 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wjw44" podUID="53ea92d3-1ca4-4663-9a90-c9cb24c6bec1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.788417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"acf3dfb9755755638ffd6e948567cf1c39ebb83126dadc34cc9a251c0a909cd6"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.788483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1658182100b408fd6a3d178338dc90413513869e4a7f4f1337589f0539f65776"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.788735 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.791290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerStarted","Data":"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a"} Jan 28 06:51:21 crc kubenswrapper[4776]: 
I0128 06:51:21.791348 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerStarted","Data":"1d31c5bd5ba06be90a7d4575a573a56015faedd2b7c9dfe1c2c5d0e8886738a5"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.794331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"67579199d88a14d44324c6a0526807b58b6ecdf44561f7c279b83d603455b782"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.794376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c55c73f739b77315002415e24f1bd259d6d626ac64de7cee269b69baebd73378"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.801614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"91127ca915f14d9524781ed97a938f04731a62a9b702b52ef0b39db55b7515f5"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.801818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ef856443850834e58c91865e43a8cdcd8e16c244e482c955a77ac1e2a9df9a0d"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.813577 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" 
event={"ID":"784215d5-15f7-4ff3-b0b5-f176cc7b14b2","Type":"ContainerDied","Data":"d6a88a19616096f3f68e3d54274c7f09f1d080f23d360e51b3bdb7460990d66b"} Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.813635 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6a88a19616096f3f68e3d54274c7f09f1d080f23d360e51b3bdb7460990d66b" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.813617 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.855058 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.855101 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.870884 4776 patch_prober.go:28] interesting pod/console-f9d7485db-p5b6p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.870942 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-p5b6p" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.916772 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.932710 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qxtd" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.966704 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:21 crc kubenswrapper[4776]: I0128 06:51:21.967850 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.026654 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.041071 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.042764 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.050432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.051954 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.106720 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.106810 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.106873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8dhg\" (UniqueName: \"kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.140249 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.210851 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.210939 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8dhg\" (UniqueName: \"kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.211010 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " 
pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.211866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.211917 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.263592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8dhg\" (UniqueName: \"kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg\") pod \"redhat-operators-8d9l4\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.263889 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.311962 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir\") pod \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.312045 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access\") pod \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\" (UID: \"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9\") " Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.312082 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" (UID: "c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.312241 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.315770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" (UID: "c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.389064 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.414034 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.437107 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:51:22 crc kubenswrapper[4776]: E0128 06:51:22.437346 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" containerName="pruner" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.437366 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" containerName="pruner" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.437504 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9" containerName="pruner" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.438376 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.449080 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.489713 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.515302 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.515366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52zpk\" (UniqueName: \"kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.515603 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: W0128 06:51:22.524458 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79a920c_ccfe_464a_afe3_26d89327d4d9.slice/crio-16f950981f35365962dc37c9985338d1d81eb6574798d7f702c538569b9fa218 WatchSource:0}: Error finding container 
16f950981f35365962dc37c9985338d1d81eb6574798d7f702c538569b9fa218: Status 404 returned error can't find the container with id 16f950981f35365962dc37c9985338d1d81eb6574798d7f702c538569b9fa218 Jan 28 06:51:22 crc kubenswrapper[4776]: E0128 06:51:22.529923 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:22 crc kubenswrapper[4776]: E0128 06:51:22.535396 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.611830 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.617799 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618408 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52zpk\" (UniqueName: \"kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618594 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:22 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:22 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:22 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618668 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618769 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.618862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.667965 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52zpk\" (UniqueName: \"kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk\") pod \"redhat-operators-lmx8d\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: E0128 06:51:22.693903 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:22 crc kubenswrapper[4776]: E0128 06:51:22.693990 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.769009 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.832580 4776 generic.go:334] "Generic (PLEG): container finished" podID="d608fa02-5844-4167-831f-c754aeca5050" containerID="f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a" exitCode=0 Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.832686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerDied","Data":"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a"} Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.837149 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerDied","Data":"86e2a3c8066a8c300ecba7b2332b09bc5b2e4a4fd353086a9cc1aa23d736c950"} Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.836751 4776 generic.go:334] "Generic (PLEG): container finished" podID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerID="86e2a3c8066a8c300ecba7b2332b09bc5b2e4a4fd353086a9cc1aa23d736c950" exitCode=0 Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.837606 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerStarted","Data":"16f950981f35365962dc37c9985338d1d81eb6574798d7f702c538569b9fa218"} Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.839768 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.843815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1c9ea8e-e2fb-4ec9-84d8-b64d2770bdd9","Type":"ContainerDied","Data":"1e0df2ef2b195a5316edb0f56bdc7aff190165b1873d806d7875579eb1adc106"} Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.843853 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0df2ef2b195a5316edb0f56bdc7aff190165b1873d806d7875579eb1adc106" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.851039 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8ddhh" Jan 28 06:51:22 crc kubenswrapper[4776]: I0128 06:51:22.920413 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.149064 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.613774 4776 patch_prober.go:28] interesting pod/router-default-5444994796-tcdcf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 06:51:23 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Jan 28 06:51:23 crc kubenswrapper[4776]: [+]process-running ok Jan 28 06:51:23 crc kubenswrapper[4776]: healthz check failed Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.613839 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcdcf" podUID="071d005d-cd96-4f28-b644-982b0f846135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 
28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.864520 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.866393 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.885113 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.885350 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.889250 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.895504 4776 generic.go:334] "Generic (PLEG): container finished" podID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerID="54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f" exitCode=0 Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.895812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerDied","Data":"54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f"} Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.895842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerStarted","Data":"b13455baf44ecc8d51239ba3041d7566bcfff0e003db5526d53cdfa856142698"} Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.908952 4776 generic.go:334] "Generic (PLEG): container finished" podID="40686731-ff76-403c-bbed-20ceaa786fbc" 
containerID="b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1" exitCode=0 Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.909067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerDied","Data":"b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1"} Jan 28 06:51:23 crc kubenswrapper[4776]: I0128 06:51:23.909292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerStarted","Data":"721fed48d452dae562c1e4f719a33c02100a305c31c43862e0d3919720baff6c"} Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.043917 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.044659 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.146710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.146773 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.146819 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.167312 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.206746 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.612533 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.617978 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tcdcf" Jan 28 06:51:24 crc kubenswrapper[4776]: I0128 06:51:24.916784 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 06:51:25 crc kubenswrapper[4776]: I0128 06:51:25.062810 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 06:51:25 crc kubenswrapper[4776]: I0128 06:51:25.079146 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 28 06:51:25 crc kubenswrapper[4776]: I0128 06:51:25.942433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"39bb7fcf-5073-4563-a338-2f4048e1b29d","Type":"ContainerStarted","Data":"9b15fc28488bece3a587b2a15bc7a79125fccc288ed4f3c51118ebafcee3df88"} Jan 28 06:51:26 crc kubenswrapper[4776]: I0128 06:51:26.957067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"39bb7fcf-5073-4563-a338-2f4048e1b29d","Type":"ContainerStarted","Data":"35c43e7aec6c4096bd02bf713f5eae5e9570c071fc97be3ab4027f189b899ba8"} Jan 28 06:51:26 crc kubenswrapper[4776]: I0128 06:51:26.980503 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.980486861 podStartE2EDuration="3.980486861s" podCreationTimestamp="2026-01-28 06:51:23 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:26.970687601 +0000 UTC m=+58.386347771" watchObservedRunningTime="2026-01-28 06:51:26.980486861 +0000 UTC m=+58.396147021" Jan 28 06:51:26 crc kubenswrapper[4776]: I0128 06:51:26.988495 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.988480607 podStartE2EDuration="1.988480607s" podCreationTimestamp="2026-01-28 06:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:26.986713843 +0000 UTC m=+58.402374003" watchObservedRunningTime="2026-01-28 06:51:26.988480607 +0000 UTC m=+58.404140767" Jan 28 06:51:27 crc kubenswrapper[4776]: I0128 06:51:27.520180 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pkbgp" Jan 28 06:51:27 crc kubenswrapper[4776]: I0128 06:51:27.985384 4776 generic.go:334] "Generic (PLEG): container finished" podID="39bb7fcf-5073-4563-a338-2f4048e1b29d" containerID="35c43e7aec6c4096bd02bf713f5eae5e9570c071fc97be3ab4027f189b899ba8" exitCode=0 Jan 28 06:51:27 crc kubenswrapper[4776]: I0128 06:51:27.985436 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"39bb7fcf-5073-4563-a338-2f4048e1b29d","Type":"ContainerDied","Data":"35c43e7aec6c4096bd02bf713f5eae5e9570c071fc97be3ab4027f189b899ba8"} Jan 28 06:51:31 crc kubenswrapper[4776]: I0128 06:51:31.752048 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wjw44" Jan 28 06:51:31 crc kubenswrapper[4776]: I0128 06:51:31.871025 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:31 crc kubenswrapper[4776]: I0128 
06:51:31.875049 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 06:51:32 crc kubenswrapper[4776]: E0128 06:51:32.531584 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:32 crc kubenswrapper[4776]: E0128 06:51:32.541431 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:32 crc kubenswrapper[4776]: E0128 06:51:32.548263 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:32 crc kubenswrapper[4776]: E0128 06:51:32.548368 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:33 crc kubenswrapper[4776]: I0128 06:51:33.599941 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:33 crc kubenswrapper[4776]: I0128 06:51:33.600261 4776 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" containerID="cri-o://46dc529a83d924236bf0101d580a2a32093a154ee95f9a222019c276d5d7eb15" gracePeriod=30 Jan 28 06:51:33 crc kubenswrapper[4776]: I0128 06:51:33.691082 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:33 crc kubenswrapper[4776]: I0128 06:51:33.691470 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerName="route-controller-manager" containerID="cri-o://521670f1b900680a57e51c2be6942d15b588053f6d0e2c6fe35212772d88f820" gracePeriod=30 Jan 28 06:51:34 crc kubenswrapper[4776]: I0128 06:51:34.037982 4776 generic.go:334] "Generic (PLEG): container finished" podID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerID="46dc529a83d924236bf0101d580a2a32093a154ee95f9a222019c276d5d7eb15" exitCode=0 Jan 28 06:51:34 crc kubenswrapper[4776]: I0128 06:51:34.038073 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" event={"ID":"be5bb707-a7a1-4b88-a75e-0093c14a4764","Type":"ContainerDied","Data":"46dc529a83d924236bf0101d580a2a32093a154ee95f9a222019c276d5d7eb15"} Jan 28 06:51:34 crc kubenswrapper[4776]: I0128 06:51:34.040603 4776 generic.go:334] "Generic (PLEG): container finished" podID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerID="521670f1b900680a57e51c2be6942d15b588053f6d0e2c6fe35212772d88f820" exitCode=0 Jan 28 06:51:34 crc kubenswrapper[4776]: I0128 06:51:34.040689 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" 
event={"ID":"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b","Type":"ContainerDied","Data":"521670f1b900680a57e51c2be6942d15b588053f6d0e2c6fe35212772d88f820"} Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.488765 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.587949 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access\") pod \"39bb7fcf-5073-4563-a338-2f4048e1b29d\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.588007 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir\") pod \"39bb7fcf-5073-4563-a338-2f4048e1b29d\" (UID: \"39bb7fcf-5073-4563-a338-2f4048e1b29d\") " Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.588259 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "39bb7fcf-5073-4563-a338-2f4048e1b29d" (UID: "39bb7fcf-5073-4563-a338-2f4048e1b29d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.594365 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "39bb7fcf-5073-4563-a338-2f4048e1b29d" (UID: "39bb7fcf-5073-4563-a338-2f4048e1b29d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.689725 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39bb7fcf-5073-4563-a338-2f4048e1b29d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:36 crc kubenswrapper[4776]: I0128 06:51:36.689969 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39bb7fcf-5073-4563-a338-2f4048e1b29d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:37 crc kubenswrapper[4776]: I0128 06:51:37.084152 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"39bb7fcf-5073-4563-a338-2f4048e1b29d","Type":"ContainerDied","Data":"9b15fc28488bece3a587b2a15bc7a79125fccc288ed4f3c51118ebafcee3df88"} Jan 28 06:51:37 crc kubenswrapper[4776]: I0128 06:51:37.084214 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b15fc28488bece3a587b2a15bc7a79125fccc288ed4f3c51118ebafcee3df88" Jan 28 06:51:37 crc kubenswrapper[4776]: I0128 06:51:37.084253 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 06:51:39 crc kubenswrapper[4776]: I0128 06:51:39.759949 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.608238 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.641494 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:40 crc kubenswrapper[4776]: E0128 06:51:40.642241 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerName="route-controller-manager" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.642342 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerName="route-controller-manager" Jan 28 06:51:40 crc kubenswrapper[4776]: E0128 06:51:40.642427 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bb7fcf-5073-4563-a338-2f4048e1b29d" containerName="pruner" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.642491 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bb7fcf-5073-4563-a338-2f4048e1b29d" containerName="pruner" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.642665 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bb7fcf-5073-4563-a338-2f4048e1b29d" containerName="pruner" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.642736 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" containerName="route-controller-manager" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.643321 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.651530 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.753242 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngs2k\" (UniqueName: \"kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k\") pod \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.753479 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca\") pod \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.753572 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert\") pod \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.753939 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config\") pod \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\" (UID: \"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b\") " Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca\") pod 
\"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754538 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754606 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhzg\" (UniqueName: \"kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754767 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config" (OuterVolumeSpecName: "config") pod "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" (UID: "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.754837 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca" (OuterVolumeSpecName: "client-ca") pod "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" (UID: "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.764043 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" (UID: "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.765276 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k" (OuterVolumeSpecName: "kube-api-access-ngs2k") pod "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" (UID: "f1a25b7e-31ff-40e1-9aa0-07ef01b6333b"). InnerVolumeSpecName "kube-api-access-ngs2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855375 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855418 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhzg\" (UniqueName: \"kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855469 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855480 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngs2k\" (UniqueName: \"kubernetes.io/projected/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-kube-api-access-ngs2k\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855490 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.855499 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.856951 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.857876 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.862843 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert\") pod 
\"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.883605 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhzg\" (UniqueName: \"kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg\") pod \"route-controller-manager-7d79786988-7xh9m\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:40 crc kubenswrapper[4776]: I0128 06:51:40.966858 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.109833 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" event={"ID":"f1a25b7e-31ff-40e1-9aa0-07ef01b6333b","Type":"ContainerDied","Data":"fcbf74d9e1476fd5be6f8cc66a8680eb901e71291185a73fd22143e7229e1ab0"} Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.109927 4776 scope.go:117] "RemoveContainer" containerID="521670f1b900680a57e51c2be6942d15b588053f6d0e2c6fe35212772d88f820" Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.109936 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b" Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.148104 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.151773 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zcm9b"] Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.313848 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a25b7e-31ff-40e1-9aa0-07ef01b6333b" path="/var/lib/kubelet/pods/f1a25b7e-31ff-40e1-9aa0-07ef01b6333b/volumes" Jan 28 06:51:41 crc kubenswrapper[4776]: I0128 06:51:41.321819 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 28 06:51:42 crc kubenswrapper[4776]: E0128 06:51:42.529569 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:42 crc kubenswrapper[4776]: E0128 06:51:42.531081 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:42 crc kubenswrapper[4776]: E0128 06:51:42.533433 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:42 crc kubenswrapper[4776]: E0128 06:51:42.533470 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:42 crc kubenswrapper[4776]: I0128 06:51:42.722651 4776 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9fmw6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 06:51:42 crc kubenswrapper[4776]: I0128 06:51:42.722810 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.642206 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.642757 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxb24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jvjv2_openshift-marketplace(beb166aa-d9c2-4658-af43-8d5d2eb61588): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.644748 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jvjv2" 
podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.712678 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.713317 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2v6wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod community-operators-24xrp_openshift-marketplace(a3f85efd-d5e1-45c6-9fa4-211ef9b477b4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:48 crc kubenswrapper[4776]: E0128 06:51:48.714486 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-24xrp" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" Jan 28 06:51:49 crc kubenswrapper[4776]: I0128 06:51:49.166348 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xlbwv_5a30af5d-0b53-4d54-af2a-2a4d5a296e6a/kube-multus-additional-cni-plugins/0.log" Jan 28 06:51:49 crc kubenswrapper[4776]: I0128 06:51:49.166425 4776 generic.go:334] "Generic (PLEG): container finished" podID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" exitCode=137 Jan 28 06:51:49 crc kubenswrapper[4776]: I0128 06:51:49.166527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" event={"ID":"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a","Type":"ContainerDied","Data":"9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09"} Jan 28 06:51:49 crc kubenswrapper[4776]: I0128 06:51:49.214183 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.214158043 podStartE2EDuration="8.214158043s" podCreationTimestamp="2026-01-28 06:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:49.211336397 +0000 UTC 
m=+80.626996567" watchObservedRunningTime="2026-01-28 06:51:49.214158043 +0000 UTC m=+80.629818203" Jan 28 06:51:49 crc kubenswrapper[4776]: E0128 06:51:49.378609 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jvjv2" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" Jan 28 06:51:49 crc kubenswrapper[4776]: E0128 06:51:49.380350 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-24xrp" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" Jan 28 06:51:49 crc kubenswrapper[4776]: E0128 06:51:49.551891 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 06:51:49 crc kubenswrapper[4776]: E0128 06:51:49.552480 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzvrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ndmtj_openshift-marketplace(d608fa02-5844-4167-831f-c754aeca5050): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:49 crc kubenswrapper[4776]: E0128 06:51:49.554121 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ndmtj" podUID="d608fa02-5844-4167-831f-c754aeca5050" Jan 28 06:51:52 crc 
kubenswrapper[4776]: E0128 06:51:52.528223 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09 is running failed: container process not found" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:52 crc kubenswrapper[4776]: E0128 06:51:52.530462 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09 is running failed: container process not found" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:52 crc kubenswrapper[4776]: E0128 06:51:52.530889 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09 is running failed: container process not found" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 28 06:51:52 crc kubenswrapper[4776]: E0128 06:51:52.531001 4776 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:52 crc kubenswrapper[4776]: I0128 06:51:52.707505 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sznv4" Jan 28 06:51:52 crc kubenswrapper[4776]: I0128 06:51:52.718936 4776 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9fmw6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 06:51:52 crc kubenswrapper[4776]: I0128 06:51:52.719006 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 06:51:53 crc kubenswrapper[4776]: E0128 06:51:53.239844 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ndmtj" podUID="d608fa02-5844-4167-831f-c754aeca5050" Jan 28 06:51:53 crc kubenswrapper[4776]: E0128 06:51:53.318594 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 28 06:51:53 crc kubenswrapper[4776]: E0128 06:51:53.318778 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52zpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lmx8d_openshift-marketplace(40686731-ff76-403c-bbed-20ceaa786fbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:53 crc kubenswrapper[4776]: E0128 06:51:53.319958 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lmx8d" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" Jan 28 06:51:53 crc 
kubenswrapper[4776]: I0128 06:51:53.611035 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:54 crc kubenswrapper[4776]: E0128 06:51:54.887759 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lmx8d" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" Jan 28 06:51:54 crc kubenswrapper[4776]: E0128 06:51:54.955678 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 06:51:54 crc kubenswrapper[4776]: E0128 06:51:54.956350 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8d6nt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2f2sb_openshift-marketplace(8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:54 crc kubenswrapper[4776]: E0128 06:51:54.957517 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2f2sb" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" Jan 28 06:51:55 crc 
kubenswrapper[4776]: I0128 06:51:55.009585 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.014786 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xlbwv_5a30af5d-0b53-4d54-af2a-2a4d5a296e6a/kube-multus-additional-cni-plugins/0.log" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.014855 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.064166 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.064350 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztnwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5cqq4_openshift-marketplace(f691d5a6-7d36-4834-8844-ccd5c12b6645): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.066699 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5cqq4" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" Jan 28 06:51:55 crc 
kubenswrapper[4776]: I0128 06:51:55.084796 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.085526 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.085566 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.085583 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.085591 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.085713 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" containerName="controller-manager" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.085791 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" containerName="kube-multus-additional-cni-plugins" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086106 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") pod \"be5bb707-a7a1-4b88-a75e-0093c14a4764\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086300 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q8s6\" (UniqueName: 
\"kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6\") pod \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086389 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir\") pod \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086419 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clfbw\" (UniqueName: \"kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw\") pod \"be5bb707-a7a1-4b88-a75e-0093c14a4764\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086439 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") pod \"be5bb707-a7a1-4b88-a75e-0093c14a4764\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086491 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") pod \"be5bb707-a7a1-4b88-a75e-0093c14a4764\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086585 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready\") pod \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086658 
4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist\") pod \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\" (UID: \"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.086691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") pod \"be5bb707-a7a1-4b88-a75e-0093c14a4764\" (UID: \"be5bb707-a7a1-4b88-a75e-0093c14a4764\") " Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.087684 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" (UID: "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.087797 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca" (OuterVolumeSpecName: "client-ca") pod "be5bb707-a7a1-4b88-a75e-0093c14a4764" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.087861 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config" (OuterVolumeSpecName: "config") pod "be5bb707-a7a1-4b88-a75e-0093c14a4764" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.088078 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.088886 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready" (OuterVolumeSpecName: "ready") pod "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" (UID: "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.088936 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.089366 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "be5bb707-a7a1-4b88-a75e-0093c14a4764" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.089816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" (UID: "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.099461 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw" (OuterVolumeSpecName: "kube-api-access-clfbw") pod "be5bb707-a7a1-4b88-a75e-0093c14a4764" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764"). InnerVolumeSpecName "kube-api-access-clfbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.100436 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6" (OuterVolumeSpecName: "kube-api-access-5q8s6") pod "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" (UID: "5a30af5d-0b53-4d54-af2a-2a4d5a296e6a"). InnerVolumeSpecName "kube-api-access-5q8s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.117939 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.118118 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-675ck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-482wz_openshift-marketplace(f79a920c-ccfe-464a-afe3-26d89327d4d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.118798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be5bb707-a7a1-4b88-a75e-0093c14a4764" (UID: "be5bb707-a7a1-4b88-a75e-0093c14a4764"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.119483 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-482wz" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.174446 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.174603 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8dhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8d9l4_openshift-marketplace(51afc3ef-b111-4228-859a-9ff98f2b5131): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.175874 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8d9l4" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" Jan 28 06:51:55 crc 
kubenswrapper[4776]: I0128 06:51:55.188096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188514 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-948pc\" (UniqueName: 
\"kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188713 4776 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188727 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188741 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188752 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q8s6\" (UniqueName: \"kubernetes.io/projected/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-kube-api-access-5q8s6\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188762 4776 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188772 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clfbw\" (UniqueName: \"kubernetes.io/projected/be5bb707-a7a1-4b88-a75e-0093c14a4764-kube-api-access-clfbw\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188782 4776 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be5bb707-a7a1-4b88-a75e-0093c14a4764-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188792 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be5bb707-a7a1-4b88-a75e-0093c14a4764-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.188801 4776 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a-ready\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.206565 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.220871 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-xlbwv_5a30af5d-0b53-4d54-af2a-2a4d5a296e6a/kube-multus-additional-cni-plugins/0.log" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.220961 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" event={"ID":"5a30af5d-0b53-4d54-af2a-2a4d5a296e6a","Type":"ContainerDied","Data":"d0b27cdc494f409d0213644148415452e415155acd88e7b7396d40c5e2104440"} Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.221010 4776 scope.go:117] "RemoveContainer" containerID="9670c75eadf39171b320c0f6d5400649ead8db572ecd5acd30a451b17e5e9c09" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.221116 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-xlbwv" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.231791 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" event={"ID":"be5bb707-a7a1-4b88-a75e-0093c14a4764","Type":"ContainerDied","Data":"ce4d259758def76a79a071f5bef5eab36501b29656f2d097f1e9b3c2c96d81c0"} Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.231902 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9fmw6" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.251149 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-482wz" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.251578 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8d9l4" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.251650 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5cqq4" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" Jan 28 06:51:55 crc kubenswrapper[4776]: E0128 06:51:55.251750 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2f2sb" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.263389 4776 scope.go:117] "RemoveContainer" containerID="46dc529a83d924236bf0101d580a2a32093a154ee95f9a222019c276d5d7eb15" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.290261 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.290848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.290873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-948pc\" (UniqueName: \"kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.290936 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " 
pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.290992 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.295336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.295737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.297705 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.298213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert\") pod \"controller-manager-75d98599bf-2gk59\" (UID: 
\"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.318955 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-948pc\" (UniqueName: \"kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc\") pod \"controller-manager-75d98599bf-2gk59\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.328864 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.340316 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9fmw6"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.346027 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xlbwv"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.349128 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-xlbwv"] Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.437287 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:55 crc kubenswrapper[4776]: I0128 06:51:55.890258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:51:55 crc kubenswrapper[4776]: W0128 06:51:55.896778 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90cf6f1_329a_4a94_bd6c_7b1cae105aea.slice/crio-bcef5f03ed8d2929481c4ecdf1497662eb7cc61f1f6fddfe27d21a460096d6f0 WatchSource:0}: Error finding container bcef5f03ed8d2929481c4ecdf1497662eb7cc61f1f6fddfe27d21a460096d6f0: Status 404 returned error can't find the container with id bcef5f03ed8d2929481c4ecdf1497662eb7cc61f1f6fddfe27d21a460096d6f0 Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.247206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" event={"ID":"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9","Type":"ContainerStarted","Data":"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b"} Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.247298 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" event={"ID":"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9","Type":"ContainerStarted","Data":"906dba4e6f1e9a2b79e566d946f312653b8271c72349d8e432a91726c06c1ebb"} Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.247332 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.247855 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" 
podUID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" containerName="route-controller-manager" containerID="cri-o://ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b" gracePeriod=30 Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.254072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" event={"ID":"b90cf6f1-329a-4a94-bd6c-7b1cae105aea","Type":"ContainerStarted","Data":"c12173d53338f3b5f503fa5a1ebed6763fee73421f1ac8e4543ea9769ec3ea8d"} Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.254136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" event={"ID":"b90cf6f1-329a-4a94-bd6c-7b1cae105aea","Type":"ContainerStarted","Data":"bcef5f03ed8d2929481c4ecdf1497662eb7cc61f1f6fddfe27d21a460096d6f0"} Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.254513 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.257195 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.272727 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.294722 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" podStartSLOduration=23.294694097 podStartE2EDuration="23.294694097s" podCreationTimestamp="2026-01-28 06:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:56.292175849 
+0000 UTC m=+87.707836009" watchObservedRunningTime="2026-01-28 06:51:56.294694097 +0000 UTC m=+87.710354257" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.355045 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" podStartSLOduration=3.355016951 podStartE2EDuration="3.355016951s" podCreationTimestamp="2026-01-28 06:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:56.340536773 +0000 UTC m=+87.756196943" watchObservedRunningTime="2026-01-28 06:51:56.355016951 +0000 UTC m=+87.770677111" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.716227 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.912412 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert\") pod \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.912489 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config\") pod \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.912541 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hhzg\" (UniqueName: \"kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg\") pod \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " Jan 28 06:51:56 
crc kubenswrapper[4776]: I0128 06:51:56.912638 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca\") pod \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\" (UID: \"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9\") " Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.914048 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" (UID: "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.914445 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config" (OuterVolumeSpecName: "config") pod "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" (UID: "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.920734 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg" (OuterVolumeSpecName: "kube-api-access-6hhzg") pod "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" (UID: "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9"). InnerVolumeSpecName "kube-api-access-6hhzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:51:56 crc kubenswrapper[4776]: I0128 06:51:56.923324 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" (UID: "e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.014183 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.014230 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.014248 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hhzg\" (UniqueName: \"kubernetes.io/projected/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-kube-api-access-6hhzg\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.014266 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.268629 4776 generic.go:334] "Generic (PLEG): container finished" podID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" containerID="ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b" exitCode=0 Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.268682 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" event={"ID":"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9","Type":"ContainerDied","Data":"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b"} Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.268741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" 
event={"ID":"e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9","Type":"ContainerDied","Data":"906dba4e6f1e9a2b79e566d946f312653b8271c72349d8e432a91726c06c1ebb"} Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.268773 4776 scope.go:117] "RemoveContainer" containerID="ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.268967 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.293712 4776 scope.go:117] "RemoveContainer" containerID="ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b" Jan 28 06:51:57 crc kubenswrapper[4776]: E0128 06:51:57.295028 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b\": container with ID starting with ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b not found: ID does not exist" containerID="ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.295091 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b"} err="failed to get container status \"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b\": rpc error: code = NotFound desc = could not find container \"ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b\": container with ID starting with ede89da186fc77c7c1df457543711f3adef417688e6a23612f340c111c5b767b not found: ID does not exist" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.313391 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a30af5d-0b53-4d54-af2a-2a4d5a296e6a" 
path="/var/lib/kubelet/pods/5a30af5d-0b53-4d54-af2a-2a4d5a296e6a/volumes" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.314678 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5bb707-a7a1-4b88-a75e-0093c14a4764" path="/var/lib/kubelet/pods/be5bb707-a7a1-4b88-a75e-0093c14a4764/volumes" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.315315 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.315370 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d79786988-7xh9m"] Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.928155 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:51:57 crc kubenswrapper[4776]: E0128 06:51:57.929218 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" containerName="route-controller-manager" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.929251 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" containerName="route-controller-manager" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.929447 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" containerName="route-controller-manager" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.930194 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.933280 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.933870 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.936685 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.936697 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.936898 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.937000 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 06:51:57 crc kubenswrapper[4776]: I0128 06:51:57.973997 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.130665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxqw\" (UniqueName: \"kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.130757 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.131130 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.131272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.232658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxqw\" (UniqueName: \"kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.233114 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config\") pod 
\"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.233234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.233347 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.234695 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.234734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.243834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.257161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxqw\" (UniqueName: \"kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw\") pod \"route-controller-manager-954f97495-wtdns\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.268901 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:58 crc kubenswrapper[4776]: I0128 06:51:58.670352 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.288129 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" event={"ID":"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb","Type":"ContainerStarted","Data":"cc9cb8bad264b1c1fbccb72bcbaeb10a4327cca4d625be33ee8bd2d21ad66ca8"} Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.288603 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" event={"ID":"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb","Type":"ContainerStarted","Data":"85a570788cf993f79771c573d798d927813fc4e253a674a0dbe71b536fd4e8fe"} Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.288629 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.294345 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.308827 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" podStartSLOduration=6.308810837 podStartE2EDuration="6.308810837s" podCreationTimestamp="2026-01-28 06:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:51:59.308278333 +0000 UTC m=+90.723938503" watchObservedRunningTime="2026-01-28 06:51:59.308810837 +0000 UTC m=+90.724470987" Jan 28 06:51:59 crc kubenswrapper[4776]: I0128 06:51:59.313377 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9" path="/var/lib/kubelet/pods/e0ee9aa2-1685-4a2f-8df2-b5b25d0939d9/volumes" Jan 28 06:52:00 crc kubenswrapper[4776]: I0128 06:52:00.840903 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.671702 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.673088 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.673227 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.677475 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.677898 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.685585 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.685668 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.786956 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.787643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.787185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.810034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:01 crc kubenswrapper[4776]: I0128 06:52:01.989450 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:02 crc kubenswrapper[4776]: I0128 06:52:02.309073 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerDied","Data":"9e1e1229d1b0e27e73fa462a4ec6176fb0a87ab9ddc74f08a14570bd02cde323"} Jan 28 06:52:02 crc kubenswrapper[4776]: I0128 06:52:02.309517 4776 generic.go:334] "Generic (PLEG): container finished" podID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerID="9e1e1229d1b0e27e73fa462a4ec6176fb0a87ab9ddc74f08a14570bd02cde323" exitCode=0 Jan 28 06:52:02 crc kubenswrapper[4776]: I0128 06:52:02.409792 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 06:52:02 crc kubenswrapper[4776]: W0128 06:52:02.721826 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod20dc5e3d_9de7_442b_a460_16b5f4e875df.slice/crio-205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0 WatchSource:0}: Error finding container 205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0: Status 404 returned error can't find the container with id 205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0 Jan 28 06:52:03 crc kubenswrapper[4776]: I0128 06:52:03.334688 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerStarted","Data":"16000ac6adf8c72e4096d8ba31b1f35e2a2ea833a4dcc6361376e833b66f200c"} Jan 28 06:52:03 crc kubenswrapper[4776]: I0128 06:52:03.337799 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20dc5e3d-9de7-442b-a460-16b5f4e875df","Type":"ContainerStarted","Data":"c2ea62c826cc91a55de488c845cef9fd6ddbc757e05d4977b4c199b15a50e1fb"} Jan 28 06:52:03 crc 
kubenswrapper[4776]: I0128 06:52:03.337841 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20dc5e3d-9de7-442b-a460-16b5f4e875df","Type":"ContainerStarted","Data":"205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0"} Jan 28 06:52:03 crc kubenswrapper[4776]: I0128 06:52:03.360490 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-24xrp" podStartSLOduration=2.325079742 podStartE2EDuration="44.360462806s" podCreationTimestamp="2026-01-28 06:51:19 +0000 UTC" firstStartedPulling="2026-01-28 06:51:20.732016686 +0000 UTC m=+52.147676846" lastFinishedPulling="2026-01-28 06:52:02.76739975 +0000 UTC m=+94.183059910" observedRunningTime="2026-01-28 06:52:03.35872972 +0000 UTC m=+94.774389880" watchObservedRunningTime="2026-01-28 06:52:03.360462806 +0000 UTC m=+94.776122966" Jan 28 06:52:03 crc kubenswrapper[4776]: I0128 06:52:03.378453 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.378433057 podStartE2EDuration="2.378433057s" podCreationTimestamp="2026-01-28 06:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:03.374240674 +0000 UTC m=+94.789900834" watchObservedRunningTime="2026-01-28 06:52:03.378433057 +0000 UTC m=+94.794093217" Jan 28 06:52:04 crc kubenswrapper[4776]: I0128 06:52:04.348529 4776 generic.go:334] "Generic (PLEG): container finished" podID="20dc5e3d-9de7-442b-a460-16b5f4e875df" containerID="c2ea62c826cc91a55de488c845cef9fd6ddbc757e05d4977b4c199b15a50e1fb" exitCode=0 Jan 28 06:52:04 crc kubenswrapper[4776]: I0128 06:52:04.348598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"20dc5e3d-9de7-442b-a460-16b5f4e875df","Type":"ContainerDied","Data":"c2ea62c826cc91a55de488c845cef9fd6ddbc757e05d4977b4c199b15a50e1fb"} Jan 28 06:52:04 crc kubenswrapper[4776]: I0128 06:52:04.351813 4776 generic.go:334] "Generic (PLEG): container finished" podID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerID="063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132" exitCode=0 Jan 28 06:52:04 crc kubenswrapper[4776]: I0128 06:52:04.351864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerDied","Data":"063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132"} Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.361639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerStarted","Data":"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8"} Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.387896 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jvjv2" podStartSLOduration=2.404980645 podStartE2EDuration="46.387870098s" podCreationTimestamp="2026-01-28 06:51:19 +0000 UTC" firstStartedPulling="2026-01-28 06:51:20.761586651 +0000 UTC m=+52.177246811" lastFinishedPulling="2026-01-28 06:52:04.744476094 +0000 UTC m=+96.160136264" observedRunningTime="2026-01-28 06:52:05.385755621 +0000 UTC m=+96.801415781" watchObservedRunningTime="2026-01-28 06:52:05.387870098 +0000 UTC m=+96.803530258" Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.788167 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.870660 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access\") pod \"20dc5e3d-9de7-442b-a460-16b5f4e875df\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.870904 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir\") pod \"20dc5e3d-9de7-442b-a460-16b5f4e875df\" (UID: \"20dc5e3d-9de7-442b-a460-16b5f4e875df\") " Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.871239 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "20dc5e3d-9de7-442b-a460-16b5f4e875df" (UID: "20dc5e3d-9de7-442b-a460-16b5f4e875df"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.878963 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "20dc5e3d-9de7-442b-a460-16b5f4e875df" (UID: "20dc5e3d-9de7-442b-a460-16b5f4e875df"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.972296 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20dc5e3d-9de7-442b-a460-16b5f4e875df-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:05 crc kubenswrapper[4776]: I0128 06:52:05.972368 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20dc5e3d-9de7-442b-a460-16b5f4e875df-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:06 crc kubenswrapper[4776]: I0128 06:52:06.370710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20dc5e3d-9de7-442b-a460-16b5f4e875df","Type":"ContainerDied","Data":"205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0"} Jan 28 06:52:06 crc kubenswrapper[4776]: I0128 06:52:06.370762 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205c2ec4186110e1c065e4b6b1f124b70dc82c8b08a5e15b8dacaabfa80f9fa0" Jan 28 06:52:06 crc kubenswrapper[4776]: I0128 06:52:06.370839 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.062762 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 06:52:07 crc kubenswrapper[4776]: E0128 06:52:07.063323 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dc5e3d-9de7-442b-a460-16b5f4e875df" containerName="pruner" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.063336 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dc5e3d-9de7-442b-a460-16b5f4e875df" containerName="pruner" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.063442 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dc5e3d-9de7-442b-a460-16b5f4e875df" containerName="pruner" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.063818 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.067390 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.067687 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.077106 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.091971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.092070 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.092103 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.193984 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.194117 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.194143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.194157 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.194204 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.215714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access\") pod \"installer-9-crc\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.381521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerStarted","Data":"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c"} Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.383954 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.386038 4776 generic.go:334] "Generic (PLEG): container finished" podID="d608fa02-5844-4167-831f-c754aeca5050" containerID="74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502" exitCode=0 Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.386104 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerDied","Data":"74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502"} Jan 28 06:52:07 crc kubenswrapper[4776]: I0128 06:52:07.823863 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.401326 4776 generic.go:334] "Generic (PLEG): container finished" podID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerID="d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c" exitCode=0 Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.401719 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerDied","Data":"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c"} Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.406601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerStarted","Data":"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b"} Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.409767 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"95d1a963-b856-4c93-b148-c90d7fd98582","Type":"ContainerStarted","Data":"d766d9b821f0459a36cadb5c984085744f95c90dd6f2930cd70892bab189732f"} Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.409812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"95d1a963-b856-4c93-b148-c90d7fd98582","Type":"ContainerStarted","Data":"e3324534f826734f17b1d5bb3069fdfd9ae3d7a79cf6ae6990033781bfa47213"} Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.448145 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.448123262 podStartE2EDuration="1.448123262s" podCreationTimestamp="2026-01-28 06:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:08.445655707 +0000 UTC m=+99.861315887" watchObservedRunningTime="2026-01-28 06:52:08.448123262 +0000 UTC m=+99.863783422" Jan 28 06:52:08 crc kubenswrapper[4776]: I0128 06:52:08.470249 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ndmtj" podStartSLOduration=2.4681704890000002 podStartE2EDuration="47.470216893s" podCreationTimestamp="2026-01-28 06:51:21 +0000 UTC" firstStartedPulling="2026-01-28 06:51:22.835433172 +0000 UTC m=+54.251093332" lastFinishedPulling="2026-01-28 06:52:07.837479566 +0000 UTC m=+99.253139736" observedRunningTime="2026-01-28 06:52:08.466618677 +0000 UTC m=+99.882278827" watchObservedRunningTime="2026-01-28 06:52:08.470216893 +0000 UTC m=+99.885877053" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.372329 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.372819 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.418333 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerStarted","Data":"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae"} Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.422810 4776 generic.go:334] "Generic (PLEG): container finished" podID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerID="4efef5ab3f72369911fbfd7d0882ab0b7c842b7ba2c8587899da583a578ec22d" exitCode=0 Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.422898 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerDied","Data":"4efef5ab3f72369911fbfd7d0882ab0b7c842b7ba2c8587899da583a578ec22d"} Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.442216 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5cqq4" podStartSLOduration=2.163214576 podStartE2EDuration="50.442199668s" podCreationTimestamp="2026-01-28 06:51:19 +0000 UTC" firstStartedPulling="2026-01-28 06:51:20.752633282 +0000 UTC m=+52.168293442" lastFinishedPulling="2026-01-28 06:52:09.031618364 +0000 UTC m=+100.447278534" observedRunningTime="2026-01-28 06:52:09.440725539 +0000 UTC m=+100.856385699" watchObservedRunningTime="2026-01-28 06:52:09.442199668 +0000 UTC m=+100.857859828" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.532079 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.553071 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:09 crc 
kubenswrapper[4776]: I0128 06:52:09.553116 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.579358 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.769486 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.769734 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:09 crc kubenswrapper[4776]: I0128 06:52:09.821849 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:10 crc kubenswrapper[4776]: I0128 06:52:10.436521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerStarted","Data":"5c3e9066279d3b32527b32de586e8d3f2471b02104d389232979717c191bc2aa"} Jan 28 06:52:10 crc kubenswrapper[4776]: I0128 06:52:10.468503 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-482wz" podStartSLOduration=2.478070549 podStartE2EDuration="49.468486406s" podCreationTimestamp="2026-01-28 06:51:21 +0000 UTC" firstStartedPulling="2026-01-28 06:51:22.838257951 +0000 UTC m=+54.253918111" lastFinishedPulling="2026-01-28 06:52:09.828673808 +0000 UTC m=+101.244333968" observedRunningTime="2026-01-28 06:52:10.465905167 +0000 UTC m=+101.881565327" watchObservedRunningTime="2026-01-28 06:52:10.468486406 +0000 UTC m=+101.884146566" Jan 28 06:52:10 crc kubenswrapper[4776]: I0128 06:52:10.490045 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:10 crc kubenswrapper[4776]: I0128 06:52:10.595124 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5cqq4" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="registry-server" probeResult="failure" output=< Jan 28 06:52:10 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 06:52:10 crc kubenswrapper[4776]: > Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.326854 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.366773 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.367937 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.423745 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.456155 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.456123829 podStartE2EDuration="456.123829ms" podCreationTimestamp="2026-01-28 06:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:11.452272136 +0000 UTC m=+102.867932296" watchObservedRunningTime="2026-01-28 06:52:11.456123829 +0000 UTC m=+102.871783989" Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.762200 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:11 crc kubenswrapper[4776]: 
I0128 06:52:11.762295 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:11 crc kubenswrapper[4776]: I0128 06:52:11.805512 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.549524 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.549879 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-24xrp" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="registry-server" containerID="cri-o://16000ac6adf8c72e4096d8ba31b1f35e2a2ea833a4dcc6361376e833b66f200c" gracePeriod=2 Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.628290 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.629003 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerName="controller-manager" containerID="cri-o://c12173d53338f3b5f503fa5a1ebed6763fee73421f1ac8e4543ea9769ec3ea8d" gracePeriod=30 Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.734258 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:52:13 crc kubenswrapper[4776]: I0128 06:52:13.734651 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" podUID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" containerName="route-controller-manager" 
containerID="cri-o://cc9cb8bad264b1c1fbccb72bcbaeb10a4327cca4d625be33ee8bd2d21ad66ca8" gracePeriod=30 Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.460775 4776 generic.go:334] "Generic (PLEG): container finished" podID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" containerID="cc9cb8bad264b1c1fbccb72bcbaeb10a4327cca4d625be33ee8bd2d21ad66ca8" exitCode=0 Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.460878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" event={"ID":"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb","Type":"ContainerDied","Data":"cc9cb8bad264b1c1fbccb72bcbaeb10a4327cca4d625be33ee8bd2d21ad66ca8"} Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.463092 4776 generic.go:334] "Generic (PLEG): container finished" podID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerID="16000ac6adf8c72e4096d8ba31b1f35e2a2ea833a4dcc6361376e833b66f200c" exitCode=0 Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.463158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerDied","Data":"16000ac6adf8c72e4096d8ba31b1f35e2a2ea833a4dcc6361376e833b66f200c"} Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.464339 4776 generic.go:334] "Generic (PLEG): container finished" podID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerID="c12173d53338f3b5f503fa5a1ebed6763fee73421f1ac8e4543ea9769ec3ea8d" exitCode=0 Jan 28 06:52:14 crc kubenswrapper[4776]: I0128 06:52:14.464429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" event={"ID":"b90cf6f1-329a-4a94-bd6c-7b1cae105aea","Type":"ContainerDied","Data":"c12173d53338f3b5f503fa5a1ebed6763fee73421f1ac8e4543ea9769ec3ea8d"} Jan 28 06:52:15 crc kubenswrapper[4776]: I0128 06:52:15.438200 4776 patch_prober.go:28] interesting 
pod/controller-manager-75d98599bf-2gk59 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Jan 28 06:52:15 crc kubenswrapper[4776]: I0128 06:52:15.439038 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.709303 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.758306 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content\") pod \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.758382 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v6wp\" (UniqueName: \"kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp\") pod \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.758476 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities\") pod \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\" (UID: \"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.760112 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities" (OuterVolumeSpecName: "utilities") pod "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" (UID: "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.769892 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp" (OuterVolumeSpecName: "kube-api-access-2v6wp") pod "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" (UID: "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4"). InnerVolumeSpecName "kube-api-access-2v6wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.780009 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.824195 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" (UID: "a3f85efd-d5e1-45c6-9fa4-211ef9b477b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.861871 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert\") pod \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.861937 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config\") pod \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.861972 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca\") pod \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.862193 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksxqw\" (UniqueName: \"kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw\") pod \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\" (UID: \"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.862507 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v6wp\" (UniqueName: \"kubernetes.io/projected/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-kube-api-access-2v6wp\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.862529 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 
06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.862539 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.863993 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" (UID: "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.864135 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config" (OuterVolumeSpecName: "config") pod "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" (UID: "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.865852 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" (UID: "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.867819 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw" (OuterVolumeSpecName: "kube-api-access-ksxqw") pod "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" (UID: "87ff0af1-a598-4872-8c0d-c8ec0a83a6bb"). InnerVolumeSpecName "kube-api-access-ksxqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.874539 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.964332 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-948pc\" (UniqueName: \"kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc\") pod \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.964441 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert\") pod \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.964656 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles\") pod \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.964754 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca\") pod \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\" (UID: \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.964817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config\") pod \"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\" (UID: 
\"b90cf6f1-329a-4a94-bd6c-7b1cae105aea\") " Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.965245 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksxqw\" (UniqueName: \"kubernetes.io/projected/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-kube-api-access-ksxqw\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.965274 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.965296 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.965318 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.968040 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc" (OuterVolumeSpecName: "kube-api-access-948pc") pod "b90cf6f1-329a-4a94-bd6c-7b1cae105aea" (UID: "b90cf6f1-329a-4a94-bd6c-7b1cae105aea"). InnerVolumeSpecName "kube-api-access-948pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:17 crc kubenswrapper[4776]: I0128 06:52:17.968255 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b90cf6f1-329a-4a94-bd6c-7b1cae105aea" (UID: "b90cf6f1-329a-4a94-bd6c-7b1cae105aea"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.151677 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-948pc\" (UniqueName: \"kubernetes.io/projected/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-kube-api-access-948pc\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.151750 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.170801 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca" (OuterVolumeSpecName: "client-ca") pod "b90cf6f1-329a-4a94-bd6c-7b1cae105aea" (UID: "b90cf6f1-329a-4a94-bd6c-7b1cae105aea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.171275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config" (OuterVolumeSpecName: "config") pod "b90cf6f1-329a-4a94-bd6c-7b1cae105aea" (UID: "b90cf6f1-329a-4a94-bd6c-7b1cae105aea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.174192 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b90cf6f1-329a-4a94-bd6c-7b1cae105aea" (UID: "b90cf6f1-329a-4a94-bd6c-7b1cae105aea"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.254714 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.254757 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.254772 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b90cf6f1-329a-4a94-bd6c-7b1cae105aea-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.493903 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerStarted","Data":"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.496304 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerStarted","Data":"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.499752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24xrp" event={"ID":"a3f85efd-d5e1-45c6-9fa4-211ef9b477b4","Type":"ContainerDied","Data":"647033e02ce16eae4f3fa2ec8cbb214b41bcfa410bf2c853d5b71b9bfae07982"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.499791 4776 scope.go:117] "RemoveContainer" containerID="16000ac6adf8c72e4096d8ba31b1f35e2a2ea833a4dcc6361376e833b66f200c" Jan 28 06:52:18 
crc kubenswrapper[4776]: I0128 06:52:18.499906 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24xrp" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.515861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" event={"ID":"b90cf6f1-329a-4a94-bd6c-7b1cae105aea","Type":"ContainerDied","Data":"bcef5f03ed8d2929481c4ecdf1497662eb7cc61f1f6fddfe27d21a460096d6f0"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.515871 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d98599bf-2gk59" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.528851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerStarted","Data":"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.530380 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" event={"ID":"87ff0af1-a598-4872-8c0d-c8ec0a83a6bb","Type":"ContainerDied","Data":"85a570788cf993f79771c573d798d927813fc4e253a674a0dbe71b536fd4e8fe"} Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.530480 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-954f97495-wtdns" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.547133 4776 scope.go:117] "RemoveContainer" containerID="9e1e1229d1b0e27e73fa462a4ec6176fb0a87ab9ddc74f08a14570bd02cde323" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.574257 4776 scope.go:117] "RemoveContainer" containerID="8a9a2537aaeeb6cdc478165556ecefb51a338a5bb5ee97f01c40223dcbb6edbd" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.577957 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.580986 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75d98599bf-2gk59"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.591049 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.595364 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-954f97495-wtdns"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.612118 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.612730 4776 scope.go:117] "RemoveContainer" containerID="c12173d53338f3b5f503fa5a1ebed6763fee73421f1ac8e4543ea9769ec3ea8d" Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.619494 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-24xrp"] Jan 28 06:52:18 crc kubenswrapper[4776]: I0128 06:52:18.627838 4776 scope.go:117] "RemoveContainer" containerID="cc9cb8bad264b1c1fbccb72bcbaeb10a4327cca4d625be33ee8bd2d21ad66ca8" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 
06:52:19.316289 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" path="/var/lib/kubelet/pods/87ff0af1-a598-4872-8c0d-c8ec0a83a6bb/volumes" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.318173 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" path="/var/lib/kubelet/pods/a3f85efd-d5e1-45c6-9fa4-211ef9b477b4/volumes" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.319426 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" path="/var/lib/kubelet/pods/b90cf6f1-329a-4a94-bd6c-7b1cae105aea/volumes" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.548205 4776 generic.go:334] "Generic (PLEG): container finished" podID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerID="a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c" exitCode=0 Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.548386 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerDied","Data":"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c"} Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.553486 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerID="cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6" exitCode=0 Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.553590 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerDied","Data":"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6"} Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.557659 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="40686731-ff76-403c-bbed-20ceaa786fbc" containerID="bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a" exitCode=0 Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.557826 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerDied","Data":"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a"} Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.607763 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.662045 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.939696 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:19 crc kubenswrapper[4776]: E0128 06:52:19.940060 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerName="controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940075 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerName="controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: E0128 06:52:19.940098 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="registry-server" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940106 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="registry-server" Jan 28 06:52:19 crc kubenswrapper[4776]: E0128 06:52:19.940116 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" 
containerName="extract-content" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940124 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="extract-content" Jan 28 06:52:19 crc kubenswrapper[4776]: E0128 06:52:19.940138 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="extract-utilities" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940163 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="extract-utilities" Jan 28 06:52:19 crc kubenswrapper[4776]: E0128 06:52:19.940175 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" containerName="route-controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940182 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" containerName="route-controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940307 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f85efd-d5e1-45c6-9fa4-211ef9b477b4" containerName="registry-server" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940327 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ff0af1-a598-4872-8c0d-c8ec0a83a6bb" containerName="route-controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940337 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90cf6f1-329a-4a94-bd6c-7b1cae105aea" containerName="controller-manager" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.940903 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.943279 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.943650 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.946051 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.946210 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.946366 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.949806 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.955376 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.956436 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.960826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.961346 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.961494 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.961733 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.961857 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.961975 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.962585 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.962596 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:19 crc kubenswrapper[4776]: I0128 06:52:19.982333 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.082585 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.082680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083166 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083221 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rbvg\" (UniqueName: \"kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " 
pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083513 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083583 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.083623 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfg5p\" (UniqueName: \"kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185481 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185578 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185608 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185655 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rbvg\" (UniqueName: \"kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " 
pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185821 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.185847 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfg5p\" (UniqueName: \"kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.189146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.189299 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.189679 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.191379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.193199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.201420 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.201766 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.213501 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rbvg\" (UniqueName: \"kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg\") pod \"controller-manager-5557c5b59c-58mpc\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.213839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfg5p\" (UniqueName: \"kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p\") pod \"route-controller-manager-57bb45c9d4-lnmgz\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.260638 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.276432 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.553474 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.574334 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerStarted","Data":"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219"} Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.598987 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerStarted","Data":"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34"} Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.601978 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8d9l4" podStartSLOduration=2.3342427629999998 podStartE2EDuration="58.601959829s" podCreationTimestamp="2026-01-28 06:51:22 +0000 UTC" firstStartedPulling="2026-01-28 06:51:23.897976821 +0000 UTC m=+55.313636981" lastFinishedPulling="2026-01-28 06:52:20.165693887 +0000 UTC m=+111.581354047" observedRunningTime="2026-01-28 06:52:20.599978386 +0000 UTC m=+112.015638546" watchObservedRunningTime="2026-01-28 06:52:20.601959829 +0000 UTC m=+112.017619989" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.617478 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerStarted","Data":"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1"} Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.631032 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2f2sb" podStartSLOduration=3.358266176 podStartE2EDuration="1m2.631010606s" podCreationTimestamp="2026-01-28 06:51:18 +0000 UTC" firstStartedPulling="2026-01-28 06:51:20.756304571 +0000 UTC m=+52.171964751" lastFinishedPulling="2026-01-28 06:52:20.029049021 +0000 UTC m=+111.444709181" observedRunningTime="2026-01-28 06:52:20.629834825 +0000 UTC m=+112.045494985" watchObservedRunningTime="2026-01-28 06:52:20.631010606 +0000 UTC m=+112.046670756" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.660308 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmx8d" podStartSLOduration=2.552449538 podStartE2EDuration="58.660280969s" podCreationTimestamp="2026-01-28 06:51:22 +0000 UTC" firstStartedPulling="2026-01-28 06:51:23.911756717 +0000 UTC m=+55.327416877" lastFinishedPulling="2026-01-28 06:52:20.019588148 +0000 UTC m=+111.435248308" observedRunningTime="2026-01-28 06:52:20.652367208 +0000 UTC m=+112.068027368" watchObservedRunningTime="2026-01-28 06:52:20.660280969 +0000 UTC m=+112.075941129" Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.749695 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:52:20 crc kubenswrapper[4776]: I0128 06:52:20.793653 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:20 crc kubenswrapper[4776]: W0128 06:52:20.801514 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aabc684_aa8f_45bf_b0d7_9764d666d3a6.slice/crio-2578451f9f317991228d9ee8726c7305249544ee7efd04cfdabdc964b1157da7 WatchSource:0}: Error finding container 2578451f9f317991228d9ee8726c7305249544ee7efd04cfdabdc964b1157da7: Status 404 
returned error can't find the container with id 2578451f9f317991228d9ee8726c7305249544ee7efd04cfdabdc964b1157da7 Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.418310 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.623905 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" event={"ID":"4aabc684-aa8f-45bf-b0d7-9764d666d3a6","Type":"ContainerStarted","Data":"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026"} Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.624143 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" event={"ID":"4aabc684-aa8f-45bf-b0d7-9764d666d3a6","Type":"ContainerStarted","Data":"2578451f9f317991228d9ee8726c7305249544ee7efd04cfdabdc964b1157da7"} Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.625031 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.626845 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5cqq4" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="registry-server" containerID="cri-o://505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae" gracePeriod=2 Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.627443 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" event={"ID":"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f","Type":"ContainerStarted","Data":"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a"} Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.627467 
4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.627476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" event={"ID":"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f","Type":"ContainerStarted","Data":"09eb721bf365273d53b46523998b07a7219b3bf819b4e25d5f39a25d9cb95ad1"} Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.635310 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.637235 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.647269 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" podStartSLOduration=8.647236554 podStartE2EDuration="8.647236554s" podCreationTimestamp="2026-01-28 06:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:21.644389548 +0000 UTC m=+113.060049708" watchObservedRunningTime="2026-01-28 06:52:21.647236554 +0000 UTC m=+113.062896714" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.695005 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" podStartSLOduration=8.694967661 podStartE2EDuration="8.694967661s" podCreationTimestamp="2026-01-28 06:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:21.693166144 
+0000 UTC m=+113.108826304" watchObservedRunningTime="2026-01-28 06:52:21.694967661 +0000 UTC m=+113.110627821" Jan 28 06:52:21 crc kubenswrapper[4776]: I0128 06:52:21.814118 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.389946 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.390008 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.539613 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.636127 4776 generic.go:334] "Generic (PLEG): container finished" podID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerID="505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae" exitCode=0 Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.636190 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerDied","Data":"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae"} Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.636923 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cqq4" event={"ID":"f691d5a6-7d36-4834-8844-ccd5c12b6645","Type":"ContainerDied","Data":"f2678e37ab6826fe251f57f31516186d86ce434a851bb0255bfd86550e41c53e"} Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.636947 4776 scope.go:117] "RemoveContainer" containerID="505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 
06:52:22.636191 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cqq4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.653440 4776 scope.go:117] "RemoveContainer" containerID="d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.656065 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content\") pod \"f691d5a6-7d36-4834-8844-ccd5c12b6645\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.656129 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztnwv\" (UniqueName: \"kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv\") pod \"f691d5a6-7d36-4834-8844-ccd5c12b6645\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.656213 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities\") pod \"f691d5a6-7d36-4834-8844-ccd5c12b6645\" (UID: \"f691d5a6-7d36-4834-8844-ccd5c12b6645\") " Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.657060 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities" (OuterVolumeSpecName: "utilities") pod "f691d5a6-7d36-4834-8844-ccd5c12b6645" (UID: "f691d5a6-7d36-4834-8844-ccd5c12b6645"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.657690 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.663791 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv" (OuterVolumeSpecName: "kube-api-access-ztnwv") pod "f691d5a6-7d36-4834-8844-ccd5c12b6645" (UID: "f691d5a6-7d36-4834-8844-ccd5c12b6645"). InnerVolumeSpecName "kube-api-access-ztnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.692257 4776 scope.go:117] "RemoveContainer" containerID="20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.716698 4776 scope.go:117] "RemoveContainer" containerID="505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae" Jan 28 06:52:22 crc kubenswrapper[4776]: E0128 06:52:22.717296 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae\": container with ID starting with 505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae not found: ID does not exist" containerID="505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.717338 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae"} err="failed to get container status \"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae\": rpc error: code = NotFound desc = could not find container 
\"505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae\": container with ID starting with 505b518df963d39877592c1bca3de5245d566d3029c91dfbfe05f0bde80200ae not found: ID does not exist" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.717365 4776 scope.go:117] "RemoveContainer" containerID="d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c" Jan 28 06:52:22 crc kubenswrapper[4776]: E0128 06:52:22.717809 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c\": container with ID starting with d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c not found: ID does not exist" containerID="d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.717838 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c"} err="failed to get container status \"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c\": rpc error: code = NotFound desc = could not find container \"d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c\": container with ID starting with d7423d1ee77d981f9e4aa75c927e61f1506ccec88d0d9cb4d11f8dbbd8dd3e4c not found: ID does not exist" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.717855 4776 scope.go:117] "RemoveContainer" containerID="20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4" Jan 28 06:52:22 crc kubenswrapper[4776]: E0128 06:52:22.718087 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4\": container with ID starting with 20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4 not found: ID does not exist" 
containerID="20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.718116 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4"} err="failed to get container status \"20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4\": rpc error: code = NotFound desc = could not find container \"20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4\": container with ID starting with 20625a91add363a943887532bb5de9373dd0a889d3189fea1d6c5cd7adb91af4 not found: ID does not exist" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.718430 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f691d5a6-7d36-4834-8844-ccd5c12b6645" (UID: "f691d5a6-7d36-4834-8844-ccd5c12b6645"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.759329 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f691d5a6-7d36-4834-8844-ccd5c12b6645-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.759371 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztnwv\" (UniqueName: \"kubernetes.io/projected/f691d5a6-7d36-4834-8844-ccd5c12b6645-kube-api-access-ztnwv\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.770370 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.770411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.964867 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:52:22 crc kubenswrapper[4776]: I0128 06:52:22.969467 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5cqq4"] Jan 28 06:52:23 crc kubenswrapper[4776]: I0128 06:52:23.313287 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" path="/var/lib/kubelet/pods/f691d5a6-7d36-4834-8844-ccd5c12b6645/volumes" Jan 28 06:52:23 crc kubenswrapper[4776]: I0128 06:52:23.434472 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8d9l4" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="registry-server" probeResult="failure" output=< Jan 28 06:52:23 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 06:52:23 crc kubenswrapper[4776]: > Jan 28 
06:52:23 crc kubenswrapper[4776]: I0128 06:52:23.813250 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmx8d" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="registry-server" probeResult="failure" output=< Jan 28 06:52:23 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 06:52:23 crc kubenswrapper[4776]: > Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.349141 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.349500 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-482wz" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="registry-server" containerID="cri-o://5c3e9066279d3b32527b32de586e8d3f2471b02104d389232979717c191bc2aa" gracePeriod=2 Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.661278 4776 generic.go:334] "Generic (PLEG): container finished" podID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerID="5c3e9066279d3b32527b32de586e8d3f2471b02104d389232979717c191bc2aa" exitCode=0 Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.661353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerDied","Data":"5c3e9066279d3b32527b32de586e8d3f2471b02104d389232979717c191bc2aa"} Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.825611 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.913020 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities\") pod \"f79a920c-ccfe-464a-afe3-26d89327d4d9\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.913096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-675ck\" (UniqueName: \"kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck\") pod \"f79a920c-ccfe-464a-afe3-26d89327d4d9\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.913126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content\") pod \"f79a920c-ccfe-464a-afe3-26d89327d4d9\" (UID: \"f79a920c-ccfe-464a-afe3-26d89327d4d9\") " Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.913894 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities" (OuterVolumeSpecName: "utilities") pod "f79a920c-ccfe-464a-afe3-26d89327d4d9" (UID: "f79a920c-ccfe-464a-afe3-26d89327d4d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.932244 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck" (OuterVolumeSpecName: "kube-api-access-675ck") pod "f79a920c-ccfe-464a-afe3-26d89327d4d9" (UID: "f79a920c-ccfe-464a-afe3-26d89327d4d9"). InnerVolumeSpecName "kube-api-access-675ck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:25 crc kubenswrapper[4776]: I0128 06:52:25.942750 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f79a920c-ccfe-464a-afe3-26d89327d4d9" (UID: "f79a920c-ccfe-464a-afe3-26d89327d4d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.014603 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.014650 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f79a920c-ccfe-464a-afe3-26d89327d4d9-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.014663 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-675ck\" (UniqueName: \"kubernetes.io/projected/f79a920c-ccfe-464a-afe3-26d89327d4d9-kube-api-access-675ck\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.680423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-482wz" event={"ID":"f79a920c-ccfe-464a-afe3-26d89327d4d9","Type":"ContainerDied","Data":"16f950981f35365962dc37c9985338d1d81eb6574798d7f702c538569b9fa218"} Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.680579 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-482wz" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.681594 4776 scope.go:117] "RemoveContainer" containerID="5c3e9066279d3b32527b32de586e8d3f2471b02104d389232979717c191bc2aa" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.712482 4776 scope.go:117] "RemoveContainer" containerID="4efef5ab3f72369911fbfd7d0882ab0b7c842b7ba2c8587899da583a578ec22d" Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.731176 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.742979 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-482wz"] Jan 28 06:52:26 crc kubenswrapper[4776]: I0128 06:52:26.745857 4776 scope.go:117] "RemoveContainer" containerID="86e2a3c8066a8c300ecba7b2332b09bc5b2e4a4fd353086a9cc1aa23d736c950" Jan 28 06:52:27 crc kubenswrapper[4776]: I0128 06:52:27.281287 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wjm9m"] Jan 28 06:52:27 crc kubenswrapper[4776]: I0128 06:52:27.318813 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" path="/var/lib/kubelet/pods/f79a920c-ccfe-464a-afe3-26d89327d4d9/volumes" Jan 28 06:52:29 crc kubenswrapper[4776]: I0128 06:52:29.169220 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:52:29 crc kubenswrapper[4776]: I0128 06:52:29.169676 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:52:29 crc kubenswrapper[4776]: I0128 06:52:29.225018 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:52:29 crc 
kubenswrapper[4776]: I0128 06:52:29.753244 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:52:32 crc kubenswrapper[4776]: I0128 06:52:32.451895 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:52:32 crc kubenswrapper[4776]: I0128 06:52:32.500630 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:52:32 crc kubenswrapper[4776]: I0128 06:52:32.844831 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:32 crc kubenswrapper[4776]: I0128 06:52:32.907055 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:33 crc kubenswrapper[4776]: I0128 06:52:33.628983 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:33 crc kubenswrapper[4776]: I0128 06:52:33.630041 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" podUID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" containerName="controller-manager" containerID="cri-o://196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a" gracePeriod=30 Jan 28 06:52:33 crc kubenswrapper[4776]: I0128 06:52:33.647383 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:33 crc kubenswrapper[4776]: I0128 06:52:33.647762 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" podUID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" 
containerName="route-controller-manager" containerID="cri-o://be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026" gracePeriod=30 Jan 28 06:52:33 crc kubenswrapper[4776]: I0128 06:52:33.704031 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.263876 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.322075 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467476 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca\") pod \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467568 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert\") pod \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467599 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert\") pod \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467671 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config\") pod \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467699 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfg5p\" (UniqueName: \"kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p\") pod \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467733 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rbvg\" (UniqueName: \"kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg\") pod \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467750 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config\") pod \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca\") pod \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\" (UID: \"4aabc684-aa8f-45bf-b0d7-9764d666d3a6\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.467807 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles\") pod \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\" (UID: \"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f\") " Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 
06:52:34.468385 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" (UID: "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.468689 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.469230 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config" (OuterVolumeSpecName: "config") pod "4aabc684-aa8f-45bf-b0d7-9764d666d3a6" (UID: "4aabc684-aa8f-45bf-b0d7-9764d666d3a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.469416 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "4aabc684-aa8f-45bf-b0d7-9764d666d3a6" (UID: "4aabc684-aa8f-45bf-b0d7-9764d666d3a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.469744 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" (UID: "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.470450 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config" (OuterVolumeSpecName: "config") pod "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" (UID: "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.475150 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p" (OuterVolumeSpecName: "kube-api-access-zfg5p") pod "4aabc684-aa8f-45bf-b0d7-9764d666d3a6" (UID: "4aabc684-aa8f-45bf-b0d7-9764d666d3a6"). InnerVolumeSpecName "kube-api-access-zfg5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.475375 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" (UID: "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.475511 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg" (OuterVolumeSpecName: "kube-api-access-9rbvg") pod "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" (UID: "d6ae0ed9-d818-4422-b6db-22b7afbdbc9f"). InnerVolumeSpecName "kube-api-access-9rbvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.475824 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4aabc684-aa8f-45bf-b0d7-9764d666d3a6" (UID: "4aabc684-aa8f-45bf-b0d7-9764d666d3a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569686 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfg5p\" (UniqueName: \"kubernetes.io/projected/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-kube-api-access-zfg5p\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569736 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rbvg\" (UniqueName: \"kubernetes.io/projected/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-kube-api-access-9rbvg\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569750 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569762 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569775 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569786 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569796 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aabc684-aa8f-45bf-b0d7-9764d666d3a6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.569807 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f-config\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.743808 4776 generic.go:334] "Generic (PLEG): container finished" podID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" containerID="196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a" exitCode=0 Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.743877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" event={"ID":"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f","Type":"ContainerDied","Data":"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a"} Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.743907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" event={"ID":"d6ae0ed9-d818-4422-b6db-22b7afbdbc9f","Type":"ContainerDied","Data":"09eb721bf365273d53b46523998b07a7219b3bf819b4e25d5f39a25d9cb95ad1"} Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.743925 4776 scope.go:117] "RemoveContainer" containerID="196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.744031 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5557c5b59c-58mpc" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.748722 4776 generic.go:334] "Generic (PLEG): container finished" podID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" containerID="be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026" exitCode=0 Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.748979 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmx8d" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="registry-server" containerID="cri-o://3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1" gracePeriod=2 Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.749337 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.751793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" event={"ID":"4aabc684-aa8f-45bf-b0d7-9764d666d3a6","Type":"ContainerDied","Data":"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026"} Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.752030 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz" event={"ID":"4aabc684-aa8f-45bf-b0d7-9764d666d3a6","Type":"ContainerDied","Data":"2578451f9f317991228d9ee8726c7305249544ee7efd04cfdabdc964b1157da7"} Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.778356 4776 scope.go:117] "RemoveContainer" containerID="196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.778990 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a\": container with ID starting with 196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a not found: ID does not exist" containerID="196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.779034 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a"} err="failed to get container status \"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a\": rpc error: code = NotFound desc = could not find container \"196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a\": container with ID starting with 196a8d1ae288bdace70e9ed06793c4b0bdcdd61602b0f6bd7d7dcfa2f514d30a not found: ID does not exist" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.779064 4776 scope.go:117] "RemoveContainer" containerID="be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.797347 4776 scope.go:117] "RemoveContainer" containerID="be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.803332 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.803337 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026\": container with ID starting with be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026 not found: ID does not exist" containerID="be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.803423 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026"} err="failed to get container status \"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026\": rpc error: code = NotFound desc = could not find container \"be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026\": container with ID starting with be60d125de8dd9d33e7bf0cb52b1749d03e24498febedf4cf36acc520a0a5026 not found: ID does not exist" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.808154 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57bb45c9d4-lnmgz"] Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.820376 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.823203 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5557c5b59c-58mpc"] Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.956018 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b6445c4c6-djltd"] Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.956799 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" containerName="controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.956948 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" containerName="controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957055 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="extract-content" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957131 4776 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="extract-content" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957206 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957303 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957375 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="extract-utilities" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957442 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="extract-utilities" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957529 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957620 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957697 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" containerName="route-controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957780 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" containerName="route-controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.957870 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="extract-utilities" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.957948 4776 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="extract-utilities" Jan 28 06:52:34 crc kubenswrapper[4776]: E0128 06:52:34.958020 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="extract-content" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.958104 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="extract-content" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.958319 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f691d5a6-7d36-4834-8844-ccd5c12b6645" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.958409 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" containerName="route-controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.958484 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79a920c-ccfe-464a-afe3-26d89327d4d9" containerName="registry-server" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.958575 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" containerName="controller-manager" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.959247 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh"] Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.959610 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.960301 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.963192 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.965653 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.965940 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.966073 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.967612 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.967761 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.968049 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 06:52:34 crc kubenswrapper[4776]: I0128 06:52:34.969065 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:34.970161 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:34.970308 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 06:52:35 crc 
kubenswrapper[4776]: I0128 06:52:34.970634 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:34.971469 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:34.975705 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:34.979919 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6445c4c6-djltd"] Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.026454 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh"] Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.080799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-client-ca\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.080896 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-proxy-ca-bundles\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.080927 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nk57r\" (UniqueName: \"kubernetes.io/projected/c8838ccf-1302-4a95-a412-886c6925e929-kube-api-access-nk57r\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.080950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-client-ca\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.080967 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvz4l\" (UniqueName: \"kubernetes.io/projected/db8114d8-0e82-4eea-8f69-4fc877795ee4-kube-api-access-wvz4l\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.081224 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-config\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.081348 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8114d8-0e82-4eea-8f69-4fc877795ee4-serving-cert\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: 
\"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.081434 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-config\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.081475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8838ccf-1302-4a95-a412-886c6925e929-serving-cert\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.182869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk57r\" (UniqueName: \"kubernetes.io/projected/c8838ccf-1302-4a95-a412-886c6925e929-kube-api-access-nk57r\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.182929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-proxy-ca-bundles\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.182960 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-client-ca\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.182983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvz4l\" (UniqueName: \"kubernetes.io/projected/db8114d8-0e82-4eea-8f69-4fc877795ee4-kube-api-access-wvz4l\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.183024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-config\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.183057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8114d8-0e82-4eea-8f69-4fc877795ee4-serving-cert\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.183088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-config\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc 
kubenswrapper[4776]: I0128 06:52:35.183109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8838ccf-1302-4a95-a412-886c6925e929-serving-cert\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.183158 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-client-ca\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.184302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-client-ca\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.184656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8838ccf-1302-4a95-a412-886c6925e929-config\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.186309 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-proxy-ca-bundles\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: 
\"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.187195 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-client-ca\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.189182 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8114d8-0e82-4eea-8f69-4fc877795ee4-config\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.194593 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8114d8-0e82-4eea-8f69-4fc877795ee4-serving-cert\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.195412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8838ccf-1302-4a95-a412-886c6925e929-serving-cert\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.209326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk57r\" (UniqueName: 
\"kubernetes.io/projected/c8838ccf-1302-4a95-a412-886c6925e929-kube-api-access-nk57r\") pod \"route-controller-manager-fbb6b8468-s89vh\" (UID: \"c8838ccf-1302-4a95-a412-886c6925e929\") " pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.225111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvz4l\" (UniqueName: \"kubernetes.io/projected/db8114d8-0e82-4eea-8f69-4fc877795ee4-kube-api-access-wvz4l\") pod \"controller-manager-7b6445c4c6-djltd\" (UID: \"db8114d8-0e82-4eea-8f69-4fc877795ee4\") " pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.262626 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.287293 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content\") pod \"40686731-ff76-403c-bbed-20ceaa786fbc\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.287420 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities\") pod \"40686731-ff76-403c-bbed-20ceaa786fbc\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.287446 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52zpk\" (UniqueName: \"kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk\") pod \"40686731-ff76-403c-bbed-20ceaa786fbc\" (UID: \"40686731-ff76-403c-bbed-20ceaa786fbc\") " Jan 28 06:52:35 crc 
kubenswrapper[4776]: I0128 06:52:35.289920 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities" (OuterVolumeSpecName: "utilities") pod "40686731-ff76-403c-bbed-20ceaa786fbc" (UID: "40686731-ff76-403c-bbed-20ceaa786fbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.291902 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk" (OuterVolumeSpecName: "kube-api-access-52zpk") pod "40686731-ff76-403c-bbed-20ceaa786fbc" (UID: "40686731-ff76-403c-bbed-20ceaa786fbc"). InnerVolumeSpecName "kube-api-access-52zpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.311578 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aabc684-aa8f-45bf-b0d7-9764d666d3a6" path="/var/lib/kubelet/pods/4aabc684-aa8f-45bf-b0d7-9764d666d3a6/volumes" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.312848 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ae0ed9-d818-4422-b6db-22b7afbdbc9f" path="/var/lib/kubelet/pods/d6ae0ed9-d818-4422-b6db-22b7afbdbc9f/volumes" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.318811 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.329333 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.390920 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.390961 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52zpk\" (UniqueName: \"kubernetes.io/projected/40686731-ff76-403c-bbed-20ceaa786fbc-kube-api-access-52zpk\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.429373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40686731-ff76-403c-bbed-20ceaa786fbc" (UID: "40686731-ff76-403c-bbed-20ceaa786fbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.492452 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40686731-ff76-403c-bbed-20ceaa786fbc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.536436 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6445c4c6-djltd"] Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.762543 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" event={"ID":"db8114d8-0e82-4eea-8f69-4fc877795ee4","Type":"ContainerStarted","Data":"6c8b90379ba2cd8ec4934796c99f07ef7cc73acfe420efcc51d028af80093a37"} Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.763295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" event={"ID":"db8114d8-0e82-4eea-8f69-4fc877795ee4","Type":"ContainerStarted","Data":"e351d5df2a44dff99c2c071263187ba35e06a5b1171beb7a94d9d6c21791aba8"} Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.763353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.766280 4776 patch_prober.go:28] interesting pod/controller-manager-7b6445c4c6-djltd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.766342 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" podUID="db8114d8-0e82-4eea-8f69-4fc877795ee4" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.768495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerDied","Data":"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1"} Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.768521 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmx8d" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.768593 4776 scope.go:117] "RemoveContainer" containerID="3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.769504 4776 generic.go:334] "Generic (PLEG): container finished" podID="40686731-ff76-403c-bbed-20ceaa786fbc" containerID="3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1" exitCode=0 Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.769629 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmx8d" event={"ID":"40686731-ff76-403c-bbed-20ceaa786fbc","Type":"ContainerDied","Data":"721fed48d452dae562c1e4f719a33c02100a305c31c43862e0d3919720baff6c"} Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.793159 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" podStartSLOduration=2.793124596 podStartE2EDuration="2.793124596s" podCreationTimestamp="2026-01-28 06:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:35.789433017 +0000 UTC m=+127.205093177" watchObservedRunningTime="2026-01-28 06:52:35.793124596 
+0000 UTC m=+127.208784776" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.794134 4776 scope.go:117] "RemoveContainer" containerID="bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a" Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.812662 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.816800 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmx8d"] Jan 28 06:52:35 crc kubenswrapper[4776]: I0128 06:52:35.845520 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh"] Jan 28 06:52:36 crc kubenswrapper[4776]: W0128 06:52:36.183953 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8838ccf_1302_4a95_a412_886c6925e929.slice/crio-40b738249f19895ddd92f19426103f83618c910bc7535a8a810a3c070efadfd2 WatchSource:0}: Error finding container 40b738249f19895ddd92f19426103f83618c910bc7535a8a810a3c070efadfd2: Status 404 returned error can't find the container with id 40b738249f19895ddd92f19426103f83618c910bc7535a8a810a3c070efadfd2 Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.197054 4776 scope.go:117] "RemoveContainer" containerID="b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.220218 4776 scope.go:117] "RemoveContainer" containerID="3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1" Jan 28 06:52:36 crc kubenswrapper[4776]: E0128 06:52:36.221045 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1\": container with ID starting with 
3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1 not found: ID does not exist" containerID="3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.221100 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1"} err="failed to get container status \"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1\": rpc error: code = NotFound desc = could not find container \"3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1\": container with ID starting with 3a269c56d58c9cf6dfc38a1364f552e93c75dc7d7a92f484c88c88b52f050cd1 not found: ID does not exist" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.221127 4776 scope.go:117] "RemoveContainer" containerID="bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a" Jan 28 06:52:36 crc kubenswrapper[4776]: E0128 06:52:36.221844 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a\": container with ID starting with bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a not found: ID does not exist" containerID="bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.221877 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a"} err="failed to get container status \"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a\": rpc error: code = NotFound desc = could not find container \"bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a\": container with ID starting with bec9abc58c8fe698064d57e46d69d6d4cc8ae9a2d74ea340e4733c6a8024cd8a not found: ID does not 
exist" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.221892 4776 scope.go:117] "RemoveContainer" containerID="b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1" Jan 28 06:52:36 crc kubenswrapper[4776]: E0128 06:52:36.222192 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1\": container with ID starting with b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1 not found: ID does not exist" containerID="b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.222233 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1"} err="failed to get container status \"b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1\": rpc error: code = NotFound desc = could not find container \"b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1\": container with ID starting with b560bbe2547c35d355e73467ebfecb0206768604eed88226f9108ee6a18be2a1 not found: ID does not exist" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.782871 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" event={"ID":"c8838ccf-1302-4a95-a412-886c6925e929","Type":"ContainerStarted","Data":"3fe7bb66580758e8d0309f025fa360eef27f6f532696b8beaaf01ce001bcca5d"} Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.783464 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" event={"ID":"c8838ccf-1302-4a95-a412-886c6925e929","Type":"ContainerStarted","Data":"40b738249f19895ddd92f19426103f83618c910bc7535a8a810a3c070efadfd2"} Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 
06:52:36.783498 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.789091 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b6445c4c6-djltd" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.789608 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" Jan 28 06:52:36 crc kubenswrapper[4776]: I0128 06:52:36.805313 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fbb6b8468-s89vh" podStartSLOduration=3.805282015 podStartE2EDuration="3.805282015s" podCreationTimestamp="2026-01-28 06:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:52:36.804894255 +0000 UTC m=+128.220554445" watchObservedRunningTime="2026-01-28 06:52:36.805282015 +0000 UTC m=+128.220942225" Jan 28 06:52:37 crc kubenswrapper[4776]: I0128 06:52:37.313021 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" path="/var/lib/kubelet/pods/40686731-ff76-403c-bbed-20ceaa786fbc/volumes" Jan 28 06:52:45 crc kubenswrapper[4776]: I0128 06:52:45.993075 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 06:52:45 crc kubenswrapper[4776]: I0128 06:52:45.994216 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36" gracePeriod=15 Jan 28 06:52:45 
crc kubenswrapper[4776]: I0128 06:52:45.994254 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c2b1f771b5dea98c61f6affa0c5fabc211fba22627c2c44b77802a6114621eea" gracePeriod=15 Jan 28 06:52:45 crc kubenswrapper[4776]: I0128 06:52:45.994370 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6" gracePeriod=15 Jan 28 06:52:45 crc kubenswrapper[4776]: I0128 06:52:45.994415 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba" gracePeriod=15 Jan 28 06:52:45 crc kubenswrapper[4776]: I0128 06:52:45.994622 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef" gracePeriod=15 Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.077630 4776 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.080620 4776 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.080985 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081008 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081025 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081035 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081044 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="registry-server" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081053 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="registry-server" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081063 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081070 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081083 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081089 4776 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081099 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081107 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081114 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="extract-utilities" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081120 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="extract-utilities" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081128 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081134 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081151 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081157 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081164 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="extract-content" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081170 
4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="extract-content" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081280 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="40686731-ff76-403c-bbed-20ceaa786fbc" containerName="registry-server" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081313 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081321 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081331 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081339 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081349 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081359 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.081478 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081488 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.081601 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.082813 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.083314 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.084853 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.100571 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.250976 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251072 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251227 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251384 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251528 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.251636 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.353586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354237 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.353830 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354314 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354743 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354818 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354993 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.355011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.355083 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.354581 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.355265 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.401688 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: W0128 06:52:46.429119 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-49075cbaba8e77edd19218c889e91989fd66cd6954b1c4bf668db546acc71f70 WatchSource:0}: Error finding container 49075cbaba8e77edd19218c889e91989fd66cd6954b1c4bf668db546acc71f70: Status 404 returned error can't find the container with id 49075cbaba8e77edd19218c889e91989fd66cd6954b1c4bf668db546acc71f70 Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.433587 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188ed283a3b4a225 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 06:52:46.432256549 +0000 UTC m=+137.847916719,LastTimestamp:2026-01-28 06:52:46.432256549 +0000 UTC m=+137.847916719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.864295 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.865678 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.866275 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.866665 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 
06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.867084 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.867129 4776 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.867446 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.876088 4776 generic.go:334] "Generic (PLEG): container finished" podID="95d1a963-b856-4c93-b148-c90d7fd98582" containerID="d766d9b821f0459a36cadb5c984085744f95c90dd6f2930cd70892bab189732f" exitCode=0 Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.876184 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"95d1a963-b856-4c93-b148-c90d7fd98582","Type":"ContainerDied","Data":"d766d9b821f0459a36cadb5c984085744f95c90dd6f2930cd70892bab189732f"} Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.876849 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.877314 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.879241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852"} Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.879322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"49075cbaba8e77edd19218c889e91989fd66cd6954b1c4bf668db546acc71f70"} Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.880059 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: E0128 06:52:46.880285 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.880394 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.883734 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.886055 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.887090 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2b1f771b5dea98c61f6affa0c5fabc211fba22627c2c44b77802a6114621eea" exitCode=0 Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.887198 4776 scope.go:117] "RemoveContainer" containerID="84b82ffbdd42a74796b09a3d84e2471eb60c7b500ecc54b8b4be4eb863808d17" Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.887259 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6" exitCode=0 Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.887377 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba" exitCode=0 Jan 28 06:52:46 crc kubenswrapper[4776]: I0128 06:52:46.887397 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef" exitCode=2 Jan 28 06:52:47 crc kubenswrapper[4776]: E0128 06:52:47.068356 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Jan 28 06:52:47 crc kubenswrapper[4776]: E0128 06:52:47.469894 
4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Jan 28 06:52:47 crc kubenswrapper[4776]: I0128 06:52:47.902325 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 06:52:48 crc kubenswrapper[4776]: E0128 06:52:48.270927 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.450596 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.451736 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.600721 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir\") pod \"95d1a963-b856-4c93-b148-c90d7fd98582\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.600755 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir" (OuterVolumeSpecName: 
"kubelet-dir") pod "95d1a963-b856-4c93-b148-c90d7fd98582" (UID: "95d1a963-b856-4c93-b148-c90d7fd98582"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.600835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access\") pod \"95d1a963-b856-4c93-b148-c90d7fd98582\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.601032 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock\") pod \"95d1a963-b856-4c93-b148-c90d7fd98582\" (UID: \"95d1a963-b856-4c93-b148-c90d7fd98582\") " Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.601133 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock" (OuterVolumeSpecName: "var-lock") pod "95d1a963-b856-4c93-b148-c90d7fd98582" (UID: "95d1a963-b856-4c93-b148-c90d7fd98582"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.601586 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-var-lock\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.601631 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95d1a963-b856-4c93-b148-c90d7fd98582-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.606513 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "95d1a963-b856-4c93-b148-c90d7fd98582" (UID: "95d1a963-b856-4c93-b148-c90d7fd98582"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.702958 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95d1a963-b856-4c93-b148-c90d7fd98582-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.914577 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"95d1a963-b856-4c93-b148-c90d7fd98582","Type":"ContainerDied","Data":"e3324534f826734f17b1d5bb3069fdfd9ae3d7a79cf6ae6990033781bfa47213"} Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.914646 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3324534f826734f17b1d5bb3069fdfd9ae3d7a79cf6ae6990033781bfa47213" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.915164 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.919331 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.920353 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36" exitCode=0 Jan 28 06:52:48 crc kubenswrapper[4776]: I0128 06:52:48.943070 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.033536 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.034855 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.035607 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.036369 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211344 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211409 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211580 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211824 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211846 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.211857 4776 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.307276 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.308394 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.310821 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 28 06:52:49 crc kubenswrapper[4776]: E0128 06:52:49.873164 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.932790 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.934049 4776 scope.go:117] "RemoveContainer" containerID="c2b1f771b5dea98c61f6affa0c5fabc211fba22627c2c44b77802a6114621eea" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.934097 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.935523 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.935998 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.938499 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: 
I0128 06:52:49.939389 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.957389 4776 scope.go:117] "RemoveContainer" containerID="0ef202f53754566d05dc7073dadeb0dcbab561efccefeafadb095ef203c19ef6" Jan 28 06:52:49 crc kubenswrapper[4776]: I0128 06:52:49.983469 4776 scope.go:117] "RemoveContainer" containerID="d3d32d093f1347ed338fc0d40960a25bcdaf6347b5feed6e02c95f2437401dba" Jan 28 06:52:50 crc kubenswrapper[4776]: I0128 06:52:50.000153 4776 scope.go:117] "RemoveContainer" containerID="d9f2c6eb9d8936b3bb279c85c7367605abf1f7fcfb92ed3805b0e8da6885ddef" Jan 28 06:52:50 crc kubenswrapper[4776]: I0128 06:52:50.019820 4776 scope.go:117] "RemoveContainer" containerID="6828977e02a6670032ddbcd4018abad92282abf16d9129d7f81619756c152b36" Jan 28 06:52:50 crc kubenswrapper[4776]: I0128 06:52:50.036725 4776 scope.go:117] "RemoveContainer" containerID="b995f720a687e7da61e6b7bcf1a8164bf54afd5abd814be47be98ae190d00133" Jan 28 06:52:51 crc kubenswrapper[4776]: E0128 06:52:51.289764 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188ed283a3b4a225 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 06:52:46.432256549 +0000 UTC m=+137.847916719,LastTimestamp:2026-01-28 06:52:46.432256549 +0000 UTC m=+137.847916719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.315850 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" podUID="43ce9486-c553-4d64-92fb-20402352c29f" containerName="oauth-openshift" containerID="cri-o://b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5" gracePeriod=15 Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.830929 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.832478 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.832805 4776 status_manager.go:851] "Failed to get status for pod" podUID="43ce9486-c553-4d64-92fb-20402352c29f" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wjm9m\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865476 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865526 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865601 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865707 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865730 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865749 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865773 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865798 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865815 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865835 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74ck7\" (UniqueName: \"kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.865885 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert\") pod \"43ce9486-c553-4d64-92fb-20402352c29f\" (UID: \"43ce9486-c553-4d64-92fb-20402352c29f\") " Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.866525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.866604 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.867724 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.867947 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.868119 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.873255 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.874302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.874428 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7" (OuterVolumeSpecName: "kube-api-access-74ck7") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "kube-api-access-74ck7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.874851 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.874974 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.874982 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.875126 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.875714 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.876230 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "43ce9486-c553-4d64-92fb-20402352c29f" (UID: "43ce9486-c553-4d64-92fb-20402352c29f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.957280 4776 generic.go:334] "Generic (PLEG): container finished" podID="43ce9486-c553-4d64-92fb-20402352c29f" containerID="b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5" exitCode=0 Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.957344 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" event={"ID":"43ce9486-c553-4d64-92fb-20402352c29f","Type":"ContainerDied","Data":"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5"} Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.957384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" event={"ID":"43ce9486-c553-4d64-92fb-20402352c29f","Type":"ContainerDied","Data":"65da0370bc7ab2d5f76776b32337aae98c4cb7240ae6df67f53650d395b5c3bc"} Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.957374 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.957403 4776 scope.go:117] "RemoveContainer" containerID="b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.958314 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.958775 4776 status_manager.go:851] "Failed to get status for pod" podUID="43ce9486-c553-4d64-92fb-20402352c29f" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wjm9m\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966812 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966847 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966861 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 
06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966878 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966893 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966908 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966920 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966932 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966940 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966949 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966958 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.966967 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43ce9486-c553-4d64-92fb-20402352c29f-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.967024 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/43ce9486-c553-4d64-92fb-20402352c29f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.967036 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74ck7\" (UniqueName: \"kubernetes.io/projected/43ce9486-c553-4d64-92fb-20402352c29f-kube-api-access-74ck7\") on node \"crc\" DevicePath \"\"" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.972869 4776 status_manager.go:851] "Failed to get status for pod" podUID="43ce9486-c553-4d64-92fb-20402352c29f" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wjm9m\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.973449 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.977341 4776 scope.go:117] "RemoveContainer" containerID="b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5" Jan 28 06:52:52 crc kubenswrapper[4776]: E0128 06:52:52.978173 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5\": container with ID starting with b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5 not found: ID does not exist" containerID="b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5" Jan 28 06:52:52 crc kubenswrapper[4776]: I0128 06:52:52.978263 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5"} err="failed to get container status \"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5\": rpc error: code = NotFound desc = could not find container \"b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5\": container with ID starting with b155a9d1e1540e3f15151994429b4067fbce04fef67bbcb887c8cf9b700efaa5 not found: ID does not exist" Jan 28 06:52:53 crc kubenswrapper[4776]: E0128 06:52:53.075317 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="6.4s" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.304498 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.306166 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.306597 4776 status_manager.go:851] "Failed to get status for pod" podUID="43ce9486-c553-4d64-92fb-20402352c29f" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wjm9m\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.329849 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.329915 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:52:57 crc kubenswrapper[4776]: E0128 06:52:57.330530 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:57 crc kubenswrapper[4776]: I0128 06:52:57.331202 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.004800 4776 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="82862ca929d45329756f1ec58c0ecca8e2b9959bd774fcee7c9b24666355f3e8" exitCode=0 Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.004908 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"82862ca929d45329756f1ec58c0ecca8e2b9959bd774fcee7c9b24666355f3e8"} Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.005146 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63dbb56bfc4950cbd5c8c34a5fee3d3671c52804f12791eddbec351fce991f26"} Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.005471 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.005489 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:52:58 crc kubenswrapper[4776]: E0128 06:52:58.005928 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.006134 4776 status_manager.go:851] "Failed to get status for pod" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:58 crc kubenswrapper[4776]: I0128 06:52:58.006699 4776 status_manager.go:851] "Failed to get status for pod" podUID="43ce9486-c553-4d64-92fb-20402352c29f" pod="openshift-authentication/oauth-openshift-558db77b4-wjm9m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-wjm9m\": dial tcp 38.102.83.195:6443: connect: connection refused" Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.016229 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.016719 4776 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc" exitCode=1 Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.016793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc"} Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.017416 4776 scope.go:117] "RemoveContainer" containerID="bfd2228cf32dad410cc20b9ad466d4eabb3eb4270bbe60af8fe81a7297a623dc" Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.022974 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3c5bd3f321dffc61773428aeab3637f78edbc728e847af0e29c60d4996925b5a"} Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.023027 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4ef28bdc4a96071b59d5d785dcd7b1637b48c9f65b59632e03a39cec83924a0"} Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.023040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eb563a773e421dfe581883fbaa7defa69de0770efef738a15e4d089522bad5f9"} Jan 28 06:52:59 crc kubenswrapper[4776]: I0128 06:52:59.023049 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a2579153d545fdd99480d6d4618bd4d641f6964255f971b48953c496299db643"} Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.031500 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d46e897760451dbd561962012d2cd8260f9a9673f644a1554fed36aa64e2da8"} Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.032151 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.032829 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.033217 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.035001 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.035061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6a38b7e56f0c4cf94282661f8f86451b4d642ec3ebe62ed4a486b557aa3cef0"} Jan 28 06:53:00 crc kubenswrapper[4776]: I0128 06:53:00.413737 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:53:02 crc kubenswrapper[4776]: I0128 06:53:02.331713 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:02 crc kubenswrapper[4776]: I0128 06:53:02.332214 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:02 crc kubenswrapper[4776]: I0128 06:53:02.338106 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:03 crc kubenswrapper[4776]: I0128 06:53:03.851900 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:53:03 crc kubenswrapper[4776]: I0128 06:53:03.852660 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 
06:53:05 crc kubenswrapper[4776]: I0128 06:53:05.059811 4776 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:06 crc kubenswrapper[4776]: I0128 06:53:06.082617 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:06 crc kubenswrapper[4776]: I0128 06:53:06.084727 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:06 crc kubenswrapper[4776]: I0128 06:53:06.089514 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:06 crc kubenswrapper[4776]: I0128 06:53:06.094689 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d37fe95c-7859-4693-8dbc-8e6fb1f7a040" Jan 28 06:53:07 crc kubenswrapper[4776]: I0128 06:53:07.089197 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:07 crc kubenswrapper[4776]: I0128 06:53:07.090177 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9d9ff603-fc42-4716-bfad-5dba64a2d188" Jan 28 06:53:08 crc kubenswrapper[4776]: I0128 06:53:08.734004 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:53:08 crc kubenswrapper[4776]: I0128 06:53:08.737938 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:53:09 crc kubenswrapper[4776]: I0128 06:53:09.123710 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 06:53:09 crc kubenswrapper[4776]: I0128 06:53:09.318851 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d37fe95c-7859-4693-8dbc-8e6fb1f7a040" Jan 28 06:53:11 crc kubenswrapper[4776]: I0128 06:53:11.595406 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 28 06:53:11 crc kubenswrapper[4776]: I0128 06:53:11.853099 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 28 06:53:12 crc kubenswrapper[4776]: I0128 06:53:12.942265 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 28 06:53:14 crc kubenswrapper[4776]: I0128 06:53:14.501022 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 28 06:53:15 crc kubenswrapper[4776]: I0128 06:53:15.514745 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 06:53:16 crc kubenswrapper[4776]: I0128 06:53:16.006598 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 06:53:16 crc kubenswrapper[4776]: I0128 06:53:16.571891 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 28 06:53:16 crc kubenswrapper[4776]: I0128 06:53:16.760568 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 28 06:53:16 crc kubenswrapper[4776]: I0128 06:53:16.841017 4776 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 28 06:53:16 crc kubenswrapper[4776]: I0128 06:53:16.914719 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.016949 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.035629 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.185395 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.272700 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.317976 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.378263 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 28 06:53:17 crc kubenswrapper[4776]: I0128 06:53:17.669118 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.128466 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.159654 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 28 06:53:18 crc 
kubenswrapper[4776]: I0128 06:53:18.199503 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.230351 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.446467 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.465048 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.555173 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.555963 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.688656 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.853291 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 28 06:53:18 crc kubenswrapper[4776]: I0128 06:53:18.974487 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.068122 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.145434 4776 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-apiserver"/"etcd-serving-ca" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.156714 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.225590 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.274874 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.324789 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.411033 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.555091 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.649239 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.663424 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.729487 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.844378 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 
06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.849595 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 28 06:53:19 crc kubenswrapper[4776]: I0128 06:53:19.861597 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.023067 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.141075 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.201659 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.222360 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.230763 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.562319 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.586304 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.615085 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.750865 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 28 
06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.791088 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.820821 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.923577 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.943336 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 06:53:20 crc kubenswrapper[4776]: I0128 06:53:20.948155 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.126979 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.198333 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.342984 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.393500 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.603346 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.639371 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.763111 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 28 06:53:21 crc kubenswrapper[4776]: I0128 06:53:21.965010 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.004014 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.030625 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.032872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.069803 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.092162 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.107858 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.189586 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.260826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" 
Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.317090 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.371154 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.388432 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.476670 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.525926 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.576647 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.597873 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.601301 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.649461 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.695145 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.807074 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.860137 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.909588 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 28 06:53:22 crc kubenswrapper[4776]: I0128 06:53:22.980216 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.019564 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.030796 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.121949 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.181655 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.319869 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.440619 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.448206 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.661481 4776 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.746808 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.809208 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.871980 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.872157 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.952164 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.974244 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 28 06:53:23 crc kubenswrapper[4776]: I0128 06:53:23.986243 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.032708 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.053203 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.078670 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.196288 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.235028 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.338335 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.374772 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.415309 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.493869 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.520158 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.555097 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.604437 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.606199 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.677191 
4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.813661 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 06:53:24 crc kubenswrapper[4776]: I0128 06:53:24.882195 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.008240 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.106601 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.112222 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.135875 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.177020 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.308157 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.319585 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.364282 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" 
Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.379231 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.472742 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.486131 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.575249 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.611848 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.687292 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.704171 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.714619 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.733442 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 06:53:25 crc kubenswrapper[4776]: I0128 06:53:25.872988 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.002042 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.003140 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.035839 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.087260 4776 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.186027 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.209127 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.297992 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.362001 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.410387 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.415618 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.417916 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 28 
06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.425400 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.465418 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.480466 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.481262 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.492120 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.499820 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.504934 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.529580 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.574377 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.580172 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wjm9m","openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.580243 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.587049 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.608628 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.608602345 podStartE2EDuration="21.608602345s" podCreationTimestamp="2026-01-28 06:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:53:26.608291837 +0000 UTC m=+178.023952037" watchObservedRunningTime="2026-01-28 06:53:26.608602345 +0000 UTC m=+178.024262505" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.639063 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.659703 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.710969 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.728315 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.738010 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.744245 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 06:53:26 crc 
kubenswrapper[4776]: I0128 06:53:26.849829 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.855022 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 28 06:53:26 crc kubenswrapper[4776]: I0128 06:53:26.856826 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.055311 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.096310 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.202947 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.254466 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.269027 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.313485 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ce9486-c553-4d64-92fb-20402352c29f" path="/var/lib/kubelet/pods/43ce9486-c553-4d64-92fb-20402352c29f/volumes" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.328337 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.351424 4776 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.368845 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.441350 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.488791 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.527048 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.620288 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.620724 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852" gracePeriod=5 Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.637433 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.649827 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.719793 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 28 06:53:27 crc 
kubenswrapper[4776]: I0128 06:53:27.744602 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.892029 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 28 06:53:27 crc kubenswrapper[4776]: I0128 06:53:27.942793 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.357438 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.424310 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.487643 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.533138 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.606680 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.634235 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.763302 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 28 06:53:28 crc kubenswrapper[4776]: I0128 06:53:28.819459 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 28 06:53:29 crc 
kubenswrapper[4776]: I0128 06:53:29.118997 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.120281 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.185042 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.417937 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.470486 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.597248 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.608342 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.681210 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.734202 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.816676 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.838454 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.917202 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 28 06:53:29 crc kubenswrapper[4776]: I0128 06:53:29.948189 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.046057 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.189805 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.242386 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.280309 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.292814 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.310825 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.515818 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.570418 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 
06:53:30.586805 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.635033 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.794395 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 06:53:30 crc kubenswrapper[4776]: I0128 06:53:30.796180 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.015387 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.035266 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.093627 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.158865 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.195889 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.403761 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.541843 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.606038 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.677260 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.697595 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 06:53:31 crc kubenswrapper[4776]: I0128 06:53:31.843523 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.061468 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.358858 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.488437 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.567652 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.722161 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 28 06:53:32 crc kubenswrapper[4776]: E0128 06:53:32.747951 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852.scope\": RecentStats: unable to find data in memory cache]" Jan 28 06:53:32 crc kubenswrapper[4776]: I0128 06:53:32.885517 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.231905 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.232015 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.267484 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.267620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.267725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.267826 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.267959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268063 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268207 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268255 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268784 4776 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268821 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268846 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.268870 4776 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.274260 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.283261 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.283426 4776 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852" exitCode=137 Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.283484 4776 scope.go:117] "RemoveContainer" containerID="9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 
06:53:33.283693 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.285537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.316625 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.330787 4776 scope.go:117] "RemoveContainer" containerID="9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852" Jan 28 06:53:33 crc kubenswrapper[4776]: E0128 06:53:33.331403 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852\": container with ID starting with 9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852 not found: ID does not exist" containerID="9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.331595 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852"} err="failed to get container status \"9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852\": rpc error: code = NotFound desc = could not find container \"9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852\": container with ID starting with 
9e90520a82f3c59bfb301a8dde330e44eeb745a236261d58ef42ef815cf62852 not found: ID does not exist" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.369830 4776 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.853278 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:53:33 crc kubenswrapper[4776]: I0128 06:53:33.853395 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 06:53:34 crc kubenswrapper[4776]: I0128 06:53:34.484236 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 06:53:34 crc kubenswrapper[4776]: I0128 06:53:34.677190 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.660272 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992508 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-59cf6c497f-w62js"] Jan 28 06:53:35 crc kubenswrapper[4776]: E0128 06:53:35.992807 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992824 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 06:53:35 crc kubenswrapper[4776]: E0128 06:53:35.992843 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" containerName="installer" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992851 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" containerName="installer" Jan 28 06:53:35 crc kubenswrapper[4776]: E0128 06:53:35.992863 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ce9486-c553-4d64-92fb-20402352c29f" containerName="oauth-openshift" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992870 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ce9486-c553-4d64-92fb-20402352c29f" containerName="oauth-openshift" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992976 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.992992 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ce9486-c553-4d64-92fb-20402352c29f" containerName="oauth-openshift" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.993005 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d1a963-b856-4c93-b148-c90d7fd98582" containerName="installer" Jan 28 06:53:35 crc kubenswrapper[4776]: I0128 06:53:35.993485 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003046 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003201 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003301 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003301 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003585 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.003683 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.004139 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.004169 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.004511 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.004659 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 28 06:53:36 crc 
kubenswrapper[4776]: I0128 06:53:36.005346 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.007719 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.009253 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-59cf6c497f-w62js"] Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.010122 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.014581 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.017752 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105540 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-dir\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105616 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " 
pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105655 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-policies\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105698 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-router-certs\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105741 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-session\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105762 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxb9l\" (UniqueName: \"kubernetes.io/projected/503d72d3-32f6-4dd9-a4e3-94db256b0594-kube-api-access-pxb9l\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105792 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-service-ca\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105838 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105853 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-login\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-error\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105902 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.105920 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207492 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-dir\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207566 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207620 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-policies\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-router-certs\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-session\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxb9l\" (UniqueName: \"kubernetes.io/projected/503d72d3-32f6-4dd9-a4e3-94db256b0594-kube-api-access-pxb9l\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-service-ca\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: 
\"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-dir\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-login\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.207913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-error\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.208761 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.209307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.209797 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-audit-policies\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.210037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-service-ca\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " 
pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.212039 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-error\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.212226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.212691 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.213348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-router-certs\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.213921 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.213956 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-system-session\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.214779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-template-login\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.218863 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/503d72d3-32f6-4dd9-a4e3-94db256b0594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.237222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxb9l\" (UniqueName: \"kubernetes.io/projected/503d72d3-32f6-4dd9-a4e3-94db256b0594-kube-api-access-pxb9l\") pod \"oauth-openshift-59cf6c497f-w62js\" (UID: \"503d72d3-32f6-4dd9-a4e3-94db256b0594\") " 
pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.322163 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:36 crc kubenswrapper[4776]: I0128 06:53:36.829399 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-59cf6c497f-w62js"] Jan 28 06:53:36 crc kubenswrapper[4776]: W0128 06:53:36.840095 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod503d72d3_32f6_4dd9_a4e3_94db256b0594.slice/crio-4274d18ca61752ddc1351916ebaffec34220b3216932f4216026167aa8e0197b WatchSource:0}: Error finding container 4274d18ca61752ddc1351916ebaffec34220b3216932f4216026167aa8e0197b: Status 404 returned error can't find the container with id 4274d18ca61752ddc1351916ebaffec34220b3216932f4216026167aa8e0197b Jan 28 06:53:37 crc kubenswrapper[4776]: I0128 06:53:37.314369 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:37 crc kubenswrapper[4776]: I0128 06:53:37.314416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" event={"ID":"503d72d3-32f6-4dd9-a4e3-94db256b0594","Type":"ContainerStarted","Data":"8e552ac44a721420b24cf8021cde516f57e46731d83026fdb7d73953fc807fe4"} Jan 28 06:53:37 crc kubenswrapper[4776]: I0128 06:53:37.314437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" event={"ID":"503d72d3-32f6-4dd9-a4e3-94db256b0594","Type":"ContainerStarted","Data":"4274d18ca61752ddc1351916ebaffec34220b3216932f4216026167aa8e0197b"} Jan 28 06:53:37 crc kubenswrapper[4776]: I0128 06:53:37.529481 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" Jan 28 06:53:37 crc kubenswrapper[4776]: I0128 06:53:37.549726 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-59cf6c497f-w62js" podStartSLOduration=70.549702747 podStartE2EDuration="1m10.549702747s" podCreationTimestamp="2026-01-28 06:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:53:37.336688908 +0000 UTC m=+188.752349068" watchObservedRunningTime="2026-01-28 06:53:37.549702747 +0000 UTC m=+188.965362917" Jan 28 06:53:47 crc kubenswrapper[4776]: I0128 06:53:47.156939 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 06:53:48 crc kubenswrapper[4776]: I0128 06:53:48.381130 4776 generic.go:334] "Generic (PLEG): container finished" podID="97135081-7759-4edc-aa62-514c15190115" containerID="0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980" exitCode=0 Jan 28 06:53:48 crc kubenswrapper[4776]: I0128 06:53:48.381185 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerDied","Data":"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980"} Jan 28 06:53:48 crc kubenswrapper[4776]: I0128 06:53:48.381788 4776 scope.go:117] "RemoveContainer" containerID="0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980" Jan 28 06:53:48 crc kubenswrapper[4776]: I0128 06:53:48.819860 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 06:53:49 crc kubenswrapper[4776]: I0128 06:53:49.389903 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" 
event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerStarted","Data":"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc"} Jan 28 06:53:49 crc kubenswrapper[4776]: I0128 06:53:49.392185 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:53:49 crc kubenswrapper[4776]: I0128 06:53:49.397065 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:54:03 crc kubenswrapper[4776]: I0128 06:54:03.852182 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:54:03 crc kubenswrapper[4776]: I0128 06:54:03.853396 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 06:54:03 crc kubenswrapper[4776]: I0128 06:54:03.853495 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:54:03 crc kubenswrapper[4776]: I0128 06:54:03.854619 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 06:54:03 crc kubenswrapper[4776]: I0128 
06:54:03.854751 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c" gracePeriod=600 Jan 28 06:54:04 crc kubenswrapper[4776]: I0128 06:54:04.489946 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c" exitCode=0 Jan 28 06:54:04 crc kubenswrapper[4776]: I0128 06:54:04.490077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c"} Jan 28 06:54:04 crc kubenswrapper[4776]: I0128 06:54:04.490366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c"} Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.778141 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cg6fj"] Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.779651 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.791731 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cg6fj"] Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891708 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c014e0-724a-4270-9c32-2863c3687dfb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-registry-certificates\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c014e0-724a-4270-9c32-2863c3687dfb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-bound-sa-token\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891917 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqzw\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-kube-api-access-zqqzw\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.891994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-trusted-ca\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.892217 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-registry-tls\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.919360 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.993499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c014e0-724a-4270-9c32-2863c3687dfb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994016 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-registry-certificates\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994169 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c014e0-724a-4270-9c32-2863c3687dfb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-bound-sa-token\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994511 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqqzw\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-kube-api-access-zqqzw\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994700 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-trusted-ca\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994819 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-registry-tls\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.994077 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c014e0-724a-4270-9c32-2863c3687dfb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.995519 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-registry-certificates\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 
06:54:48 crc kubenswrapper[4776]: I0128 06:54:48.996146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c014e0-724a-4270-9c32-2863c3687dfb-trusted-ca\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.001622 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-registry-tls\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.006205 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c014e0-724a-4270-9c32-2863c3687dfb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.015882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqzw\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-kube-api-access-zqqzw\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.017471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c014e0-724a-4270-9c32-2863c3687dfb-bound-sa-token\") pod \"image-registry-66df7c8f76-cg6fj\" (UID: \"51c014e0-724a-4270-9c32-2863c3687dfb\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.104477 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.502308 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cg6fj"] Jan 28 06:54:49 crc kubenswrapper[4776]: W0128 06:54:49.512470 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c014e0_724a_4270_9c32_2863c3687dfb.slice/crio-0531a47139f8df6643376ace3d952585b4c77d5ce496611224bed07053980bc7 WatchSource:0}: Error finding container 0531a47139f8df6643376ace3d952585b4c77d5ce496611224bed07053980bc7: Status 404 returned error can't find the container with id 0531a47139f8df6643376ace3d952585b4c77d5ce496611224bed07053980bc7 Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.774743 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" event={"ID":"51c014e0-724a-4270-9c32-2863c3687dfb","Type":"ContainerStarted","Data":"872ff8c5d84cede9c0f0442e8bf8a694f3d54d8f66965dfa042d018815cdd137"} Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.775180 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" event={"ID":"51c014e0-724a-4270-9c32-2863c3687dfb","Type":"ContainerStarted","Data":"0531a47139f8df6643376ace3d952585b4c77d5ce496611224bed07053980bc7"} Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.775203 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:54:49 crc kubenswrapper[4776]: I0128 06:54:49.793406 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" podStartSLOduration=1.793382973 podStartE2EDuration="1.793382973s" podCreationTimestamp="2026-01-28 06:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:54:49.792215212 +0000 UTC m=+261.207875372" watchObservedRunningTime="2026-01-28 06:54:49.793382973 +0000 UTC m=+261.209043133" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.747154 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.748406 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2f2sb" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="registry-server" containerID="cri-o://cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34" gracePeriod=30 Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.755520 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.756893 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jvjv2" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="registry-server" containerID="cri-o://d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8" gracePeriod=30 Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.768203 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.768624 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podUID="97135081-7759-4edc-aa62-514c15190115" 
containerName="marketplace-operator" containerID="cri-o://b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc" gracePeriod=30 Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.776107 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.776368 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ndmtj" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="registry-server" containerID="cri-o://ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b" gracePeriod=30 Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.793055 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.793339 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8d9l4" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="registry-server" containerID="cri-o://60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219" gracePeriod=30 Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.807238 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45lfz"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.808528 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.831071 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45lfz"] Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.844935 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.845073 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.845105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58pc8\" (UniqueName: \"kubernetes.io/projected/3d2605e3-4b9a-4dc8-8936-b209875dbdee-kube-api-access-58pc8\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.946188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45lfz\" (UID: 
\"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.946292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.946312 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58pc8\" (UniqueName: \"kubernetes.io/projected/3d2605e3-4b9a-4dc8-8936-b209875dbdee-kube-api-access-58pc8\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.952456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.965474 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2605e3-4b9a-4dc8-8936-b209875dbdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:51 crc kubenswrapper[4776]: I0128 06:54:51.973235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58pc8\" 
(UniqueName: \"kubernetes.io/projected/3d2605e3-4b9a-4dc8-8936-b209875dbdee-kube-api-access-58pc8\") pod \"marketplace-operator-79b997595-45lfz\" (UID: \"3d2605e3-4b9a-4dc8-8936-b209875dbdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.128946 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2vn7f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.128998 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.161733 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.231736 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.272748 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.305771 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.306744 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.347178 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357508 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content\") pod \"51afc3ef-b111-4228-859a-9ff98f2b5131\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357572 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d6nt\" (UniqueName: \"kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt\") pod \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities\") pod \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357634 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content\") pod \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\" (UID: \"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357678 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities\") pod 
\"beb166aa-d9c2-4658-af43-8d5d2eb61588\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357699 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content\") pod \"d608fa02-5844-4167-831f-c754aeca5050\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8dhg\" (UniqueName: \"kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg\") pod \"51afc3ef-b111-4228-859a-9ff98f2b5131\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357745 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities\") pod \"51afc3ef-b111-4228-859a-9ff98f2b5131\" (UID: \"51afc3ef-b111-4228-859a-9ff98f2b5131\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content\") pod \"beb166aa-d9c2-4658-af43-8d5d2eb61588\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357790 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities\") pod \"d608fa02-5844-4167-831f-c754aeca5050\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357812 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-fzvrs\" (UniqueName: \"kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs\") pod \"d608fa02-5844-4167-831f-c754aeca5050\" (UID: \"d608fa02-5844-4167-831f-c754aeca5050\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.357857 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxb24\" (UniqueName: \"kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24\") pod \"beb166aa-d9c2-4658-af43-8d5d2eb61588\" (UID: \"beb166aa-d9c2-4658-af43-8d5d2eb61588\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.369232 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities" (OuterVolumeSpecName: "utilities") pod "d608fa02-5844-4167-831f-c754aeca5050" (UID: "d608fa02-5844-4167-831f-c754aeca5050"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.373718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt" (OuterVolumeSpecName: "kube-api-access-8d6nt") pod "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" (UID: "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff"). InnerVolumeSpecName "kube-api-access-8d6nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.380092 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities" (OuterVolumeSpecName: "utilities") pod "51afc3ef-b111-4228-859a-9ff98f2b5131" (UID: "51afc3ef-b111-4228-859a-9ff98f2b5131"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.380229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities" (OuterVolumeSpecName: "utilities") pod "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" (UID: "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.381923 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities" (OuterVolumeSpecName: "utilities") pod "beb166aa-d9c2-4658-af43-8d5d2eb61588" (UID: "beb166aa-d9c2-4658-af43-8d5d2eb61588"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.407827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24" (OuterVolumeSpecName: "kube-api-access-rxb24") pod "beb166aa-d9c2-4658-af43-8d5d2eb61588" (UID: "beb166aa-d9c2-4658-af43-8d5d2eb61588"). InnerVolumeSpecName "kube-api-access-rxb24". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.409524 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs" (OuterVolumeSpecName: "kube-api-access-fzvrs") pod "d608fa02-5844-4167-831f-c754aeca5050" (UID: "d608fa02-5844-4167-831f-c754aeca5050"). InnerVolumeSpecName "kube-api-access-fzvrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.428861 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d608fa02-5844-4167-831f-c754aeca5050" (UID: "d608fa02-5844-4167-831f-c754aeca5050"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.439216 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg" (OuterVolumeSpecName: "kube-api-access-r8dhg") pod "51afc3ef-b111-4228-859a-9ff98f2b5131" (UID: "51afc3ef-b111-4228-859a-9ff98f2b5131"). InnerVolumeSpecName "kube-api-access-r8dhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460201 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics\") pod \"97135081-7759-4edc-aa62-514c15190115\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460289 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca\") pod \"97135081-7759-4edc-aa62-514c15190115\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46t59\" (UniqueName: \"kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59\") pod 
\"97135081-7759-4edc-aa62-514c15190115\" (UID: \"97135081-7759-4edc-aa62-514c15190115\") " Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460591 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxb24\" (UniqueName: \"kubernetes.io/projected/beb166aa-d9c2-4658-af43-8d5d2eb61588-kube-api-access-rxb24\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460602 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d6nt\" (UniqueName: \"kubernetes.io/projected/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-kube-api-access-8d6nt\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460612 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460623 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460631 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460639 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8dhg\" (UniqueName: \"kubernetes.io/projected/51afc3ef-b111-4228-859a-9ff98f2b5131-kube-api-access-r8dhg\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460647 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 
crc kubenswrapper[4776]: I0128 06:54:52.460655 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d608fa02-5844-4167-831f-c754aeca5050-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.460689 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzvrs\" (UniqueName: \"kubernetes.io/projected/d608fa02-5844-4167-831f-c754aeca5050-kube-api-access-fzvrs\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.462638 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "97135081-7759-4edc-aa62-514c15190115" (UID: "97135081-7759-4edc-aa62-514c15190115"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.467906 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59" (OuterVolumeSpecName: "kube-api-access-46t59") pod "97135081-7759-4edc-aa62-514c15190115" (UID: "97135081-7759-4edc-aa62-514c15190115"). InnerVolumeSpecName "kube-api-access-46t59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.481172 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "97135081-7759-4edc-aa62-514c15190115" (UID: "97135081-7759-4edc-aa62-514c15190115"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.491443 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beb166aa-d9c2-4658-af43-8d5d2eb61588" (UID: "beb166aa-d9c2-4658-af43-8d5d2eb61588"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.502023 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45lfz"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.518469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" (UID: "8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.561926 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97135081-7759-4edc-aa62-514c15190115-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.561965 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97135081-7759-4edc-aa62-514c15190115-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.561975 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.561984 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46t59\" (UniqueName: \"kubernetes.io/projected/97135081-7759-4edc-aa62-514c15190115-kube-api-access-46t59\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.561994 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beb166aa-d9c2-4658-af43-8d5d2eb61588-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.564293 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51afc3ef-b111-4228-859a-9ff98f2b5131" (UID: "51afc3ef-b111-4228-859a-9ff98f2b5131"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.663091 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51afc3ef-b111-4228-859a-9ff98f2b5131-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.799677 4776 generic.go:334] "Generic (PLEG): container finished" podID="97135081-7759-4edc-aa62-514c15190115" containerID="b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc" exitCode=0 Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.799759 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerDied","Data":"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.799773 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.799814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vn7f" event={"ID":"97135081-7759-4edc-aa62-514c15190115","Type":"ContainerDied","Data":"05eebedb8da824e67350dd2d5ac32ad451d9fb7bee97a32cfbcd015f02bbe002"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.799833 4776 scope.go:117] "RemoveContainer" containerID="b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.803245 4776 generic.go:334] "Generic (PLEG): container finished" podID="d608fa02-5844-4167-831f-c754aeca5050" containerID="ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b" exitCode=0 Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.803340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerDied","Data":"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.803390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndmtj" event={"ID":"d608fa02-5844-4167-831f-c754aeca5050","Type":"ContainerDied","Data":"1d31c5bd5ba06be90a7d4575a573a56015faedd2b7c9dfe1c2c5d0e8886738a5"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.803516 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndmtj" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.808127 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" event={"ID":"3d2605e3-4b9a-4dc8-8936-b209875dbdee","Type":"ContainerStarted","Data":"bd5664e09df5386fa478dc8566cd60ce2023e1c198afb7cf73256b66d2911c96"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.808171 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" event={"ID":"3d2605e3-4b9a-4dc8-8936-b209875dbdee","Type":"ContainerStarted","Data":"4f660aaad7ef8c0790d17db3c587eb28211904908ab4ad98bf9d0def6a2d40ef"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.809915 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.809995 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45lfz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.810030 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" podUID="3d2605e3-4b9a-4dc8-8936-b209875dbdee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.817871 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerID="cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34" exitCode=0 Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 
06:54:52.817966 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerDied","Data":"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.818005 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2f2sb" event={"ID":"8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff","Type":"ContainerDied","Data":"275362d2724ba94350de6664cae52731b07115cee69466d8e23e3fdf97c0e84e"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.818130 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2f2sb" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.822315 4776 generic.go:334] "Generic (PLEG): container finished" podID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerID="d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8" exitCode=0 Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.822382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerDied","Data":"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.822417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jvjv2" event={"ID":"beb166aa-d9c2-4658-af43-8d5d2eb61588","Type":"ContainerDied","Data":"89f939bc81de658ffc0709ec512f212344c350aa2b022d81a3abf4b037b4e9c3"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.822528 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jvjv2" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.827276 4776 scope.go:117] "RemoveContainer" containerID="0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.827425 4776 generic.go:334] "Generic (PLEG): container finished" podID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerID="60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219" exitCode=0 Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.827496 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerDied","Data":"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.827540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8d9l4" event={"ID":"51afc3ef-b111-4228-859a-9ff98f2b5131","Type":"ContainerDied","Data":"b13455baf44ecc8d51239ba3041d7566bcfff0e003db5526d53cdfa856142698"} Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.827675 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8d9l4" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.842460 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" podStartSLOduration=1.8424248589999999 podStartE2EDuration="1.842424859s" podCreationTimestamp="2026-01-28 06:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 06:54:52.832878983 +0000 UTC m=+264.248539193" watchObservedRunningTime="2026-01-28 06:54:52.842424859 +0000 UTC m=+264.258085039" Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.863883 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.867338 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vn7f"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.876865 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.884752 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndmtj"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.894148 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:54:52 crc kubenswrapper[4776]: I0128 06:54:52.895625 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jvjv2"] Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.086081 4776 scope.go:117] "RemoveContainer" containerID="b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.086826 4776 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc\": container with ID starting with b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc not found: ID does not exist" containerID="b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.086881 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc"} err="failed to get container status \"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc\": rpc error: code = NotFound desc = could not find container \"b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc\": container with ID starting with b40d1f94d7d6d14cd88a9bda938b62535e8639affc510a5bf41391e3270f14cc not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.086916 4776 scope.go:117] "RemoveContainer" containerID="0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.087385 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980\": container with ID starting with 0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980 not found: ID does not exist" containerID="0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.087431 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980"} err="failed to get container status \"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980\": rpc error: code = NotFound desc = could 
not find container \"0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980\": container with ID starting with 0f38494f71fc8f23e68352e1eee1a97f2ae4839fb9f43d927bdf4a96b15e9980 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.087464 4776 scope.go:117] "RemoveContainer" containerID="ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.154831 4776 scope.go:117] "RemoveContainer" containerID="74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.167070 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.170665 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8d9l4"] Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.179134 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.181669 4776 scope.go:117] "RemoveContainer" containerID="f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.184226 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2f2sb"] Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.200149 4776 scope.go:117] "RemoveContainer" containerID="ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.200620 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b\": container with ID starting with ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b not found: ID does not 
exist" containerID="ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.200653 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b"} err="failed to get container status \"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b\": rpc error: code = NotFound desc = could not find container \"ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b\": container with ID starting with ab17f2ecf7bb969254e3c6b2d209cd1bdbaa470323f12fb58a092f3060c9b32b not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.200675 4776 scope.go:117] "RemoveContainer" containerID="74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.200932 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502\": container with ID starting with 74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502 not found: ID does not exist" containerID="74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.200954 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502"} err="failed to get container status \"74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502\": rpc error: code = NotFound desc = could not find container \"74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502\": container with ID starting with 74bc40e8465ae3813ea6d030c0547320cddf2e1fb91d5ce5185e0df80b95f502 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.200971 4776 scope.go:117] 
"RemoveContainer" containerID="f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.201276 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a\": container with ID starting with f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a not found: ID does not exist" containerID="f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.201296 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a"} err="failed to get container status \"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a\": rpc error: code = NotFound desc = could not find container \"f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a\": container with ID starting with f3359a69575e229fbae06fee69f5bdc22c8708ae9efa2a6124a07fa07bfade0a not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.201309 4776 scope.go:117] "RemoveContainer" containerID="cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.214628 4776 scope.go:117] "RemoveContainer" containerID="cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.230966 4776 scope.go:117] "RemoveContainer" containerID="801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.252950 4776 scope.go:117] "RemoveContainer" containerID="cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.253452 4776 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34\": container with ID starting with cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34 not found: ID does not exist" containerID="cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.253477 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34"} err="failed to get container status \"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34\": rpc error: code = NotFound desc = could not find container \"cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34\": container with ID starting with cff836b66f44f11374fe7a1347f99a4228d138ecb012710c80843194ec9cbc34 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.253503 4776 scope.go:117] "RemoveContainer" containerID="cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.253842 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6\": container with ID starting with cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6 not found: ID does not exist" containerID="cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.253860 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6"} err="failed to get container status \"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6\": rpc error: code = NotFound desc = could not find container 
\"cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6\": container with ID starting with cef28429e1bf4ed60139d522de3ddc3922a4e3b709dc2d51b4bcced9fcdbf8b6 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.253874 4776 scope.go:117] "RemoveContainer" containerID="801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.254074 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542\": container with ID starting with 801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542 not found: ID does not exist" containerID="801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.254090 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542"} err="failed to get container status \"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542\": rpc error: code = NotFound desc = could not find container \"801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542\": container with ID starting with 801989a52306d24cae6fc4a895ebbe2babc18cc552cadd9fce9198d8cdc0d542 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.254102 4776 scope.go:117] "RemoveContainer" containerID="d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.267038 4776 scope.go:117] "RemoveContainer" containerID="063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.291371 4776 scope.go:117] "RemoveContainer" containerID="b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0" Jan 28 06:54:53 crc 
kubenswrapper[4776]: I0128 06:54:53.309694 4776 scope.go:117] "RemoveContainer" containerID="d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.310079 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8\": container with ID starting with d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8 not found: ID does not exist" containerID="d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.310228 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8"} err="failed to get container status \"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8\": rpc error: code = NotFound desc = could not find container \"d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8\": container with ID starting with d6e9bf6730b69ed4e4b55b84540bbf5508c9e5dad1f2e54fc1bd296178a461b8 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.310278 4776 scope.go:117] "RemoveContainer" containerID="063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.310703 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132\": container with ID starting with 063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132 not found: ID does not exist" containerID="063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.310736 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132"} err="failed to get container status \"063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132\": rpc error: code = NotFound desc = could not find container \"063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132\": container with ID starting with 063124a05d4923bd110ec2a4d146263e532ff1664e593b4825968f0c8854f132 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.310755 4776 scope.go:117] "RemoveContainer" containerID="b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.311229 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0\": container with ID starting with b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0 not found: ID does not exist" containerID="b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.311280 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0"} err="failed to get container status \"b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0\": rpc error: code = NotFound desc = could not find container \"b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0\": container with ID starting with b375e2f6050d3c336d459787e66ea2e98ec81e4156d628ef0794c49d4f7f8ad0 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.311325 4776 scope.go:117] "RemoveContainer" containerID="60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.318077 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" path="/var/lib/kubelet/pods/51afc3ef-b111-4228-859a-9ff98f2b5131/volumes" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.318875 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" path="/var/lib/kubelet/pods/8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff/volumes" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.319715 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97135081-7759-4edc-aa62-514c15190115" path="/var/lib/kubelet/pods/97135081-7759-4edc-aa62-514c15190115/volumes" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.320916 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" path="/var/lib/kubelet/pods/beb166aa-d9c2-4658-af43-8d5d2eb61588/volumes" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.321679 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d608fa02-5844-4167-831f-c754aeca5050" path="/var/lib/kubelet/pods/d608fa02-5844-4167-831f-c754aeca5050/volumes" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.324538 4776 scope.go:117] "RemoveContainer" containerID="a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.341597 4776 scope.go:117] "RemoveContainer" containerID="54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.355842 4776 scope.go:117] "RemoveContainer" containerID="60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.356385 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219\": container with ID starting with 
60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219 not found: ID does not exist" containerID="60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.356425 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219"} err="failed to get container status \"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219\": rpc error: code = NotFound desc = could not find container \"60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219\": container with ID starting with 60670eb50bdaaced4b98516b62ada4122bfa05b373b5d856c75e432ebc7b4219 not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.356454 4776 scope.go:117] "RemoveContainer" containerID="a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.356901 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c\": container with ID starting with a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c not found: ID does not exist" containerID="a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.356947 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c"} err="failed to get container status \"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c\": rpc error: code = NotFound desc = could not find container \"a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c\": container with ID starting with a755a020f150595a8680c0d8776ae5a9a5157e02ddf45ebaa1075d0b7b1d0a1c not found: ID does not 
exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.356981 4776 scope.go:117] "RemoveContainer" containerID="54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f" Jan 28 06:54:53 crc kubenswrapper[4776]: E0128 06:54:53.357366 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f\": container with ID starting with 54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f not found: ID does not exist" containerID="54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.357410 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f"} err="failed to get container status \"54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f\": rpc error: code = NotFound desc = could not find container \"54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f\": container with ID starting with 54caddc4c684cc2b6de0e2157292e4f00fab4ade2a198da83a99fba7fc3dd90f not found: ID does not exist" Jan 28 06:54:53 crc kubenswrapper[4776]: I0128 06:54:53.847826 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-45lfz" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.146819 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147045 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147063 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="97135081-7759-4edc-aa62-514c15190115" 
containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147073 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147080 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147089 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147095 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147102 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147110 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147119 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147126 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147138 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147145 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147155 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147162 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147172 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147181 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147192 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147199 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147206 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147212 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147224 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147230 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147239 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147244 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="extract-utilities" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147252 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147257 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: E0128 06:54:54.147264 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147269 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="extract-content" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147380 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147395 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="51afc3ef-b111-4228-859a-9ff98f2b5131" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147403 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="97135081-7759-4edc-aa62-514c15190115" containerName="marketplace-operator" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147416 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="beb166aa-d9c2-4658-af43-8d5d2eb61588" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147425 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d608fa02-5844-4167-831f-c754aeca5050" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.147433 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc42f4d-e0d8-4b5d-8e1d-58b34f9aa0ff" containerName="registry-server" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.148159 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.149969 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.163317 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.297197 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.297273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqb27\" (UniqueName: \"kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.297315 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.348633 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.349833 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.352391 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.363154 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.398858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.398945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqb27\" (UniqueName: \"kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.399001 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.399538 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.399959 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.419035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqb27\" (UniqueName: \"kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27\") pod \"certified-operators-sfr2v\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.501337 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.502083 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xv6qq\" (UniqueName: \"kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.502159 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.502461 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.603662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6qq\" (UniqueName: \"kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.604047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.604121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities\") pod \"community-operators-x677g\" (UID: 
\"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.604638 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.605153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.625616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6qq\" (UniqueName: \"kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq\") pod \"community-operators-x677g\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.665616 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.706887 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.850284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerStarted","Data":"8ff9ed36f35f5355cedce14b0a7e92329409441de3e74aa1c2fc7bd4f27a0dcb"} Jan 28 06:54:54 crc kubenswrapper[4776]: I0128 06:54:54.873770 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 06:54:55 crc kubenswrapper[4776]: I0128 06:54:55.857122 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerID="6e612e5cfd3ce240aae5fe861d6e696bf4a1c284c305b831cb64c6a6526bcf5b" exitCode=0 Jan 28 06:54:55 crc kubenswrapper[4776]: I0128 06:54:55.857226 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerDied","Data":"6e612e5cfd3ce240aae5fe861d6e696bf4a1c284c305b831cb64c6a6526bcf5b"} Jan 28 06:54:55 crc kubenswrapper[4776]: I0128 06:54:55.861151 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerID="2dfdbc0a4d813d18fda08d0e643ee92e65af8c0742c4053502fb15ed1580e7a9" exitCode=0 Jan 28 06:54:55 crc kubenswrapper[4776]: I0128 06:54:55.861951 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerDied","Data":"2dfdbc0a4d813d18fda08d0e643ee92e65af8c0742c4053502fb15ed1580e7a9"} Jan 28 06:54:55 crc kubenswrapper[4776]: I0128 06:54:55.861990 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerStarted","Data":"1a56efb7eacc85eaa6734f94881019cab2996e28a71a2234bfc661ede28b2fd4"} Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.547730 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6fkfh"] Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.548970 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.551454 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.563137 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fkfh"] Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.627383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-utilities\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.627514 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-catalog-content\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.627694 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxrr\" (UniqueName: 
\"kubernetes.io/projected/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-kube-api-access-scxrr\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.728645 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-catalog-content\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.728710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxrr\" (UniqueName: \"kubernetes.io/projected/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-kube-api-access-scxrr\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.728760 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-utilities\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.729241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-utilities\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.729513 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-catalog-content\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.751903 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ft6p9"] Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.752914 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxrr\" (UniqueName: \"kubernetes.io/projected/4972ca48-b9b2-4811-9d6a-15aef7b4a2c1-kube-api-access-scxrr\") pod \"redhat-marketplace-6fkfh\" (UID: \"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1\") " pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.753439 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.757628 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.763739 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ft6p9"] Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.867018 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerStarted","Data":"f4d509684a872610b3c457e40c9c0f5f25e0b2fdca819c50ea3b8c31b51b2ad8"} Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.876281 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerID="e6b2383f0420ea8f2351d6a92f6414954af6b36322072384367e1f79a7b7d455" exitCode=0 Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.876330 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerDied","Data":"e6b2383f0420ea8f2351d6a92f6414954af6b36322072384367e1f79a7b7d455"} Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.882873 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.931185 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-catalog-content\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.931620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvk6f\" (UniqueName: \"kubernetes.io/projected/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-kube-api-access-jvk6f\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:56 crc kubenswrapper[4776]: I0128 06:54:56.931696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-utilities\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.034846 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-catalog-content\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " 
pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.034940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvk6f\" (UniqueName: \"kubernetes.io/projected/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-kube-api-access-jvk6f\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.035249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-utilities\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.036320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-utilities\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.036658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-catalog-content\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.062192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvk6f\" (UniqueName: \"kubernetes.io/projected/ae5aaed6-76ba-4b87-aafc-a96a98df7b3c-kube-api-access-jvk6f\") pod \"redhat-operators-ft6p9\" (UID: \"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c\") " pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 
crc kubenswrapper[4776]: I0128 06:54:57.100940 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.116910 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fkfh"] Jan 28 06:54:57 crc kubenswrapper[4776]: W0128 06:54:57.176494 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4972ca48_b9b2_4811_9d6a_15aef7b4a2c1.slice/crio-b60dd3bea2da835292ac765b7a0aca9659b5ee2dc87156627ebcd53d3195908a WatchSource:0}: Error finding container b60dd3bea2da835292ac765b7a0aca9659b5ee2dc87156627ebcd53d3195908a: Status 404 returned error can't find the container with id b60dd3bea2da835292ac765b7a0aca9659b5ee2dc87156627ebcd53d3195908a Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.293779 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ft6p9"] Jan 28 06:54:57 crc kubenswrapper[4776]: W0128 06:54:57.303529 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5aaed6_76ba_4b87_aafc_a96a98df7b3c.slice/crio-b00b499739aff2b5a30be1c2e88a6cd05d959609d9abaa3a20787b2cf499fc7f WatchSource:0}: Error finding container b00b499739aff2b5a30be1c2e88a6cd05d959609d9abaa3a20787b2cf499fc7f: Status 404 returned error can't find the container with id b00b499739aff2b5a30be1c2e88a6cd05d959609d9abaa3a20787b2cf499fc7f Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.885686 4776 generic.go:334] "Generic (PLEG): container finished" podID="4972ca48-b9b2-4811-9d6a-15aef7b4a2c1" containerID="7ef0095c10802bb85c29bf0c33560df3d46d265e41f733509210df00ca140b9a" exitCode=0 Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.885745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6fkfh" event={"ID":"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1","Type":"ContainerDied","Data":"7ef0095c10802bb85c29bf0c33560df3d46d265e41f733509210df00ca140b9a"} Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.885816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fkfh" event={"ID":"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1","Type":"ContainerStarted","Data":"b60dd3bea2da835292ac765b7a0aca9659b5ee2dc87156627ebcd53d3195908a"} Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.887610 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerID="f4d509684a872610b3c457e40c9c0f5f25e0b2fdca819c50ea3b8c31b51b2ad8" exitCode=0 Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.887671 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerDied","Data":"f4d509684a872610b3c457e40c9c0f5f25e0b2fdca819c50ea3b8c31b51b2ad8"} Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.890334 4776 generic.go:334] "Generic (PLEG): container finished" podID="ae5aaed6-76ba-4b87-aafc-a96a98df7b3c" containerID="c291e9266ab6b8cf47f0c20f6200a6743a874fb391b7c3499d5b5b150e002f14" exitCode=0 Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.890408 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ft6p9" event={"ID":"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c","Type":"ContainerDied","Data":"c291e9266ab6b8cf47f0c20f6200a6743a874fb391b7c3499d5b5b150e002f14"} Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.890430 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ft6p9" event={"ID":"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c","Type":"ContainerStarted","Data":"b00b499739aff2b5a30be1c2e88a6cd05d959609d9abaa3a20787b2cf499fc7f"} 
Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.894712 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerStarted","Data":"b19529727f27f20eabe1df5fca283e6dbd879334f832f105eb9cade7f114efea"} Jan 28 06:54:57 crc kubenswrapper[4776]: I0128 06:54:57.924234 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x677g" podStartSLOduration=2.448718372 podStartE2EDuration="3.924209628s" podCreationTimestamp="2026-01-28 06:54:54 +0000 UTC" firstStartedPulling="2026-01-28 06:54:55.863134405 +0000 UTC m=+267.278794565" lastFinishedPulling="2026-01-28 06:54:57.338625661 +0000 UTC m=+268.754285821" observedRunningTime="2026-01-28 06:54:57.921602078 +0000 UTC m=+269.337262238" watchObservedRunningTime="2026-01-28 06:54:57.924209628 +0000 UTC m=+269.339869808" Jan 28 06:54:58 crc kubenswrapper[4776]: I0128 06:54:58.901889 4776 generic.go:334] "Generic (PLEG): container finished" podID="4972ca48-b9b2-4811-9d6a-15aef7b4a2c1" containerID="76e91b40612a1599cc7b84c96185499d6cd1fb2e1e7bfd9fd286f266de1e2dd1" exitCode=0 Jan 28 06:54:58 crc kubenswrapper[4776]: I0128 06:54:58.901941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fkfh" event={"ID":"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1","Type":"ContainerDied","Data":"76e91b40612a1599cc7b84c96185499d6cd1fb2e1e7bfd9fd286f266de1e2dd1"} Jan 28 06:54:58 crc kubenswrapper[4776]: I0128 06:54:58.907437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerStarted","Data":"81b0d802a83ac4cfccc10871f6f7ef1473e1427b30b06e17bc8fc73906f1e636"} Jan 28 06:54:58 crc kubenswrapper[4776]: I0128 06:54:58.910021 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ft6p9" event={"ID":"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c","Type":"ContainerStarted","Data":"60f6c1bd935cc5a56de7fae6aba39f7aad04128aaac0dc46bfa11d5a1a19ec19"} Jan 28 06:54:58 crc kubenswrapper[4776]: I0128 06:54:58.956284 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfr2v" podStartSLOduration=2.505014018 podStartE2EDuration="4.956266199s" podCreationTimestamp="2026-01-28 06:54:54 +0000 UTC" firstStartedPulling="2026-01-28 06:54:55.85918229 +0000 UTC m=+267.274842450" lastFinishedPulling="2026-01-28 06:54:58.310434471 +0000 UTC m=+269.726094631" observedRunningTime="2026-01-28 06:54:58.95330542 +0000 UTC m=+270.368965580" watchObservedRunningTime="2026-01-28 06:54:58.956266199 +0000 UTC m=+270.371926359" Jan 28 06:54:59 crc kubenswrapper[4776]: I0128 06:54:59.918670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fkfh" event={"ID":"4972ca48-b9b2-4811-9d6a-15aef7b4a2c1","Type":"ContainerStarted","Data":"81fb47d49c3df69180962f6bf3216f6c674e87eaa338c27771b432fa52656117"} Jan 28 06:54:59 crc kubenswrapper[4776]: I0128 06:54:59.922340 4776 generic.go:334] "Generic (PLEG): container finished" podID="ae5aaed6-76ba-4b87-aafc-a96a98df7b3c" containerID="60f6c1bd935cc5a56de7fae6aba39f7aad04128aaac0dc46bfa11d5a1a19ec19" exitCode=0 Jan 28 06:54:59 crc kubenswrapper[4776]: I0128 06:54:59.923292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ft6p9" event={"ID":"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c","Type":"ContainerDied","Data":"60f6c1bd935cc5a56de7fae6aba39f7aad04128aaac0dc46bfa11d5a1a19ec19"} Jan 28 06:54:59 crc kubenswrapper[4776]: I0128 06:54:59.939710 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6fkfh" podStartSLOduration=2.491183777 podStartE2EDuration="3.93969007s" 
podCreationTimestamp="2026-01-28 06:54:56 +0000 UTC" firstStartedPulling="2026-01-28 06:54:57.888974885 +0000 UTC m=+269.304635045" lastFinishedPulling="2026-01-28 06:54:59.337481178 +0000 UTC m=+270.753141338" observedRunningTime="2026-01-28 06:54:59.936818563 +0000 UTC m=+271.352478723" watchObservedRunningTime="2026-01-28 06:54:59.93969007 +0000 UTC m=+271.355350230" Jan 28 06:55:00 crc kubenswrapper[4776]: I0128 06:55:00.930485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ft6p9" event={"ID":"ae5aaed6-76ba-4b87-aafc-a96a98df7b3c","Type":"ContainerStarted","Data":"c0e6b5d18bd7a1f6948632217a9055a9968c04edad288cc7d8fe5c4f3a680188"} Jan 28 06:55:00 crc kubenswrapper[4776]: I0128 06:55:00.951600 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ft6p9" podStartSLOduration=2.471398986 podStartE2EDuration="4.951575192s" podCreationTimestamp="2026-01-28 06:54:56 +0000 UTC" firstStartedPulling="2026-01-28 06:54:57.89179655 +0000 UTC m=+269.307456720" lastFinishedPulling="2026-01-28 06:55:00.371972766 +0000 UTC m=+271.787632926" observedRunningTime="2026-01-28 06:55:00.943879077 +0000 UTC m=+272.359539247" watchObservedRunningTime="2026-01-28 06:55:00.951575192 +0000 UTC m=+272.367235362" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.503388 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.504010 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.555929 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.666419 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.666507 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:55:04 crc kubenswrapper[4776]: I0128 06:55:04.708900 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:55:05 crc kubenswrapper[4776]: I0128 06:55:05.008742 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 06:55:05 crc kubenswrapper[4776]: I0128 06:55:05.013417 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x677g" Jan 28 06:55:06 crc kubenswrapper[4776]: I0128 06:55:06.884446 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:55:06 crc kubenswrapper[4776]: I0128 06:55:06.885070 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:55:06 crc kubenswrapper[4776]: I0128 06:55:06.937906 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:55:07 crc kubenswrapper[4776]: I0128 06:55:07.012391 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6fkfh" Jan 28 06:55:07 crc kubenswrapper[4776]: I0128 06:55:07.101961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:55:07 crc kubenswrapper[4776]: I0128 06:55:07.101999 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:55:07 crc 
kubenswrapper[4776]: I0128 06:55:07.137228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:55:08 crc kubenswrapper[4776]: I0128 06:55:08.022959 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ft6p9" Jan 28 06:55:09 crc kubenswrapper[4776]: I0128 06:55:09.117796 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cg6fj" Jan 28 06:55:09 crc kubenswrapper[4776]: I0128 06:55:09.205246 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:55:29 crc kubenswrapper[4776]: I0128 06:55:29.088346 4776 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.261158 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" podUID="57b5f9b9-549e-443e-9fc5-eb377698f57b" containerName="registry" containerID="cri-o://38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e" gracePeriod=30 Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.609655 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.703552 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704014 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704105 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704182 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704268 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704347 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704451 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2wp\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.704543 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates\") pod \"57b5f9b9-549e-443e-9fc5-eb377698f57b\" (UID: \"57b5f9b9-549e-443e-9fc5-eb377698f57b\") " Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.705351 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.705469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.713497 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.713959 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp" (OuterVolumeSpecName: "kube-api-access-fw2wp") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "kube-api-access-fw2wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.714216 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.717020 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.717886 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.721186 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "57b5f9b9-549e-443e-9fc5-eb377698f57b" (UID: "57b5f9b9-549e-443e-9fc5-eb377698f57b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806411 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806465 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57b5f9b9-549e-443e-9fc5-eb377698f57b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806482 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806495 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/57b5f9b9-549e-443e-9fc5-eb377698f57b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806508 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2wp\" (UniqueName: \"kubernetes.io/projected/57b5f9b9-549e-443e-9fc5-eb377698f57b-kube-api-access-fw2wp\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806520 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:34 crc kubenswrapper[4776]: I0128 06:55:34.806531 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b5f9b9-549e-443e-9fc5-eb377698f57b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.143844 4776 generic.go:334] "Generic (PLEG): container finished" podID="57b5f9b9-549e-443e-9fc5-eb377698f57b" containerID="38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e" exitCode=0 Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.143905 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" event={"ID":"57b5f9b9-549e-443e-9fc5-eb377698f57b","Type":"ContainerDied","Data":"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e"} Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.143906 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.143944 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vjkh5" event={"ID":"57b5f9b9-549e-443e-9fc5-eb377698f57b","Type":"ContainerDied","Data":"b1f14d6c56a8063e11ec0edf990bbc3f96aee7d3f5f3475483d7186738ae1e77"} Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.143968 4776 scope.go:117] "RemoveContainer" containerID="38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e" Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.177958 4776 scope.go:117] "RemoveContainer" containerID="38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e" Jan 28 06:55:35 crc kubenswrapper[4776]: E0128 06:55:35.180584 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e\": container with ID starting with 38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e not found: ID does not exist" containerID="38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e" Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.180667 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e"} err="failed to get container status \"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e\": rpc error: code = NotFound desc = could not find container \"38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e\": container with ID starting with 38cf452848bbdc39b20e13a8fdb6a2287fd25915cb2826b6844eea83048b826e not found: ID does not exist" Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.181242 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.184360 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vjkh5"] Jan 28 06:55:35 crc kubenswrapper[4776]: I0128 06:55:35.315672 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b5f9b9-549e-443e-9fc5-eb377698f57b" path="/var/lib/kubelet/pods/57b5f9b9-549e-443e-9fc5-eb377698f57b/volumes" Jan 28 06:56:33 crc kubenswrapper[4776]: I0128 06:56:33.851691 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:56:33 crc kubenswrapper[4776]: I0128 06:56:33.852343 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 06:56:35 crc kubenswrapper[4776]: E0128 06:56:35.261523 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Jan 28 06:57:03 crc kubenswrapper[4776]: I0128 06:57:03.852371 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:57:03 crc kubenswrapper[4776]: I0128 06:57:03.853418 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 06:57:33 crc kubenswrapper[4776]: I0128 06:57:33.852643 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 06:57:33 crc kubenswrapper[4776]: I0128 06:57:33.853585 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 06:57:33 crc kubenswrapper[4776]: I0128 06:57:33.853645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 06:57:33 crc kubenswrapper[4776]: I0128 06:57:33.854452 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 06:57:33 crc kubenswrapper[4776]: I0128 06:57:33.854589 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" 
containerID="cri-o://f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c" gracePeriod=600 Jan 28 06:57:34 crc kubenswrapper[4776]: I0128 06:57:34.897416 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c" exitCode=0 Jan 28 06:57:34 crc kubenswrapper[4776]: I0128 06:57:34.897517 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c"} Jan 28 06:57:34 crc kubenswrapper[4776]: I0128 06:57:34.898195 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0"} Jan 28 06:57:34 crc kubenswrapper[4776]: I0128 06:57:34.898225 4776 scope.go:117] "RemoveContainer" containerID="51dafa461a70027f364e6aa027f4d3a2909a6ea9aa4d2e44f29d1621241c154c" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.161850 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x"] Jan 28 07:00:00 crc kubenswrapper[4776]: E0128 07:00:00.162711 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b5f9b9-549e-443e-9fc5-eb377698f57b" containerName="registry" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.162729 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b5f9b9-549e-443e-9fc5-eb377698f57b" containerName="registry" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.162862 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b5f9b9-549e-443e-9fc5-eb377698f57b" containerName="registry" Jan 28 07:00:00 
crc kubenswrapper[4776]: I0128 07:00:00.163386 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.165641 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.167580 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x"] Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.170417 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.329318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.329387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87q5t\" (UniqueName: \"kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.329424 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume\") pod \"collect-profiles-29493060-2px6x\" 
(UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.431000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.431075 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87q5t\" (UniqueName: \"kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.431113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.432089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.436514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.447027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87q5t\" (UniqueName: \"kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t\") pod \"collect-profiles-29493060-2px6x\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.479332 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.864449 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x"] Jan 28 07:00:00 crc kubenswrapper[4776]: I0128 07:00:00.905522 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" event={"ID":"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0","Type":"ContainerStarted","Data":"733104c64249ebf750dae7c7ba3bd13fc68a6ca4b7b93109b015a40ea3898ab8"} Jan 28 07:00:01 crc kubenswrapper[4776]: I0128 07:00:01.916721 4776 generic.go:334] "Generic (PLEG): container finished" podID="2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" containerID="19289311f5ffbf2895f689d8cc6011b96409df6687327543c8108b6d1cc218de" exitCode=0 Jan 28 07:00:01 crc kubenswrapper[4776]: I0128 07:00:01.916799 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" 
event={"ID":"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0","Type":"ContainerDied","Data":"19289311f5ffbf2895f689d8cc6011b96409df6687327543c8108b6d1cc218de"} Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.185703 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.266611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume\") pod \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.266697 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87q5t\" (UniqueName: \"kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t\") pod \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.266836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume\") pod \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\" (UID: \"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0\") " Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.267756 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" (UID: "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.272694 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" (UID: "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.273082 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t" (OuterVolumeSpecName: "kube-api-access-87q5t") pod "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" (UID: "2b42b4d1-1d09-49c6-bbb9-e4c0370554c0"). InnerVolumeSpecName "kube-api-access-87q5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.368925 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.368979 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87q5t\" (UniqueName: \"kubernetes.io/projected/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-kube-api-access-87q5t\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.369001 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.852267 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.852338 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.931943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" event={"ID":"2b42b4d1-1d09-49c6-bbb9-e4c0370554c0","Type":"ContainerDied","Data":"733104c64249ebf750dae7c7ba3bd13fc68a6ca4b7b93109b015a40ea3898ab8"} Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.932011 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="733104c64249ebf750dae7c7ba3bd13fc68a6ca4b7b93109b015a40ea3898ab8" Jan 28 07:00:03 crc kubenswrapper[4776]: I0128 07:00:03.932013 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.800523 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5"] Jan 28 07:00:08 crc kubenswrapper[4776]: E0128 07:00:08.801012 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" containerName="collect-profiles" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.801023 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" containerName="collect-profiles" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.801122 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" containerName="collect-profiles" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.801470 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.803703 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2s952" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.803957 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.803980 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.804882 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-kh746"] Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.809251 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kh746" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.811197 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5"] Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.843512 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-x8hhg" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.847991 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kh746"] Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.853681 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5lrjh"] Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.854260 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.856536 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5lrjh"] Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.857476 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pwsr6" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.944695 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbhb7\" (UniqueName: \"kubernetes.io/projected/ebf51615-2906-4bc1-9224-7bdc14f6afa6-kube-api-access-fbhb7\") pod \"cert-manager-858654f9db-kh746\" (UID: \"ebf51615-2906-4bc1-9224-7bdc14f6afa6\") " pod="cert-manager/cert-manager-858654f9db-kh746" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.944854 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb894\" (UniqueName: 
\"kubernetes.io/projected/8c08bbc8-20fd-452e-8d53-4baa6ac41fc2-kube-api-access-kb894\") pod \"cert-manager-cainjector-cf98fcc89-jrkz5\" (UID: \"8c08bbc8-20fd-452e-8d53-4baa6ac41fc2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" Jan 28 07:00:08 crc kubenswrapper[4776]: I0128 07:00:08.944931 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2626l\" (UniqueName: \"kubernetes.io/projected/24f93919-f6ec-481d-b6f3-0bfd6fdb7e01-kube-api-access-2626l\") pod \"cert-manager-webhook-687f57d79b-5lrjh\" (UID: \"24f93919-f6ec-481d-b6f3-0bfd6fdb7e01\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.046064 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2626l\" (UniqueName: \"kubernetes.io/projected/24f93919-f6ec-481d-b6f3-0bfd6fdb7e01-kube-api-access-2626l\") pod \"cert-manager-webhook-687f57d79b-5lrjh\" (UID: \"24f93919-f6ec-481d-b6f3-0bfd6fdb7e01\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.046142 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbhb7\" (UniqueName: \"kubernetes.io/projected/ebf51615-2906-4bc1-9224-7bdc14f6afa6-kube-api-access-fbhb7\") pod \"cert-manager-858654f9db-kh746\" (UID: \"ebf51615-2906-4bc1-9224-7bdc14f6afa6\") " pod="cert-manager/cert-manager-858654f9db-kh746" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.046191 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb894\" (UniqueName: \"kubernetes.io/projected/8c08bbc8-20fd-452e-8d53-4baa6ac41fc2-kube-api-access-kb894\") pod \"cert-manager-cainjector-cf98fcc89-jrkz5\" (UID: \"8c08bbc8-20fd-452e-8d53-4baa6ac41fc2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 
07:00:09.076161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2626l\" (UniqueName: \"kubernetes.io/projected/24f93919-f6ec-481d-b6f3-0bfd6fdb7e01-kube-api-access-2626l\") pod \"cert-manager-webhook-687f57d79b-5lrjh\" (UID: \"24f93919-f6ec-481d-b6f3-0bfd6fdb7e01\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.076410 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbhb7\" (UniqueName: \"kubernetes.io/projected/ebf51615-2906-4bc1-9224-7bdc14f6afa6-kube-api-access-fbhb7\") pod \"cert-manager-858654f9db-kh746\" (UID: \"ebf51615-2906-4bc1-9224-7bdc14f6afa6\") " pod="cert-manager/cert-manager-858654f9db-kh746" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.078477 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb894\" (UniqueName: \"kubernetes.io/projected/8c08bbc8-20fd-452e-8d53-4baa6ac41fc2-kube-api-access-kb894\") pod \"cert-manager-cainjector-cf98fcc89-jrkz5\" (UID: \"8c08bbc8-20fd-452e-8d53-4baa6ac41fc2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.159861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.167878 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kh746" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.173776 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.383914 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kh746"] Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.396447 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:00:09 crc kubenswrapper[4776]: W0128 07:00:09.660899 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24f93919_f6ec_481d_b6f3_0bfd6fdb7e01.slice/crio-adda3ba8b269331e82c11a6391417e0c0968434117a451eb965a1f18495bb2a4 WatchSource:0}: Error finding container adda3ba8b269331e82c11a6391417e0c0968434117a451eb965a1f18495bb2a4: Status 404 returned error can't find the container with id adda3ba8b269331e82c11a6391417e0c0968434117a451eb965a1f18495bb2a4 Jan 28 07:00:09 crc kubenswrapper[4776]: W0128 07:00:09.664513 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c08bbc8_20fd_452e_8d53_4baa6ac41fc2.slice/crio-1ffc2637e8ed1ac5d61fd901455a830da9fc9d9b994b6133db9128e5555356ad WatchSource:0}: Error finding container 1ffc2637e8ed1ac5d61fd901455a830da9fc9d9b994b6133db9128e5555356ad: Status 404 returned error can't find the container with id 1ffc2637e8ed1ac5d61fd901455a830da9fc9d9b994b6133db9128e5555356ad Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.674375 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5lrjh"] Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.681372 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5"] Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.964481 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" event={"ID":"24f93919-f6ec-481d-b6f3-0bfd6fdb7e01","Type":"ContainerStarted","Data":"adda3ba8b269331e82c11a6391417e0c0968434117a451eb965a1f18495bb2a4"} Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.965384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kh746" event={"ID":"ebf51615-2906-4bc1-9224-7bdc14f6afa6","Type":"ContainerStarted","Data":"e5414f83e78498e53b13ad6f0112c472493d0c0d733f8b347257b67065677d47"} Jan 28 07:00:09 crc kubenswrapper[4776]: I0128 07:00:09.966685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" event={"ID":"8c08bbc8-20fd-452e-8d53-4baa6ac41fc2","Type":"ContainerStarted","Data":"1ffc2637e8ed1ac5d61fd901455a830da9fc9d9b994b6133db9128e5555356ad"} Jan 28 07:00:12 crc kubenswrapper[4776]: I0128 07:00:12.986041 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kh746" event={"ID":"ebf51615-2906-4bc1-9224-7bdc14f6afa6","Type":"ContainerStarted","Data":"5548bf7e3b99107a694faf24d26fb20d4f916f76c08dfde006596ba633d43709"} Jan 28 07:00:13 crc kubenswrapper[4776]: I0128 07:00:13.000462 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-kh746" podStartSLOduration=2.290173233 podStartE2EDuration="5.000439065s" podCreationTimestamp="2026-01-28 07:00:08 +0000 UTC" firstStartedPulling="2026-01-28 07:00:09.396192295 +0000 UTC m=+580.811852465" lastFinishedPulling="2026-01-28 07:00:12.106458137 +0000 UTC m=+583.522118297" observedRunningTime="2026-01-28 07:00:12.997644545 +0000 UTC m=+584.413304735" watchObservedRunningTime="2026-01-28 07:00:13.000439065 +0000 UTC m=+584.416099265" Jan 28 07:00:13 crc kubenswrapper[4776]: I0128 07:00:13.993841 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" 
event={"ID":"8c08bbc8-20fd-452e-8d53-4baa6ac41fc2","Type":"ContainerStarted","Data":"393cbbe1967889fb44591ae24aa60ca9dabb78c137f7f415fa2a8c2d5134c503"} Jan 28 07:00:13 crc kubenswrapper[4776]: I0128 07:00:13.995622 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" event={"ID":"24f93919-f6ec-481d-b6f3-0bfd6fdb7e01","Type":"ContainerStarted","Data":"9117b47d03549979f917da0a9ddfa531309704944fee131a13ef8e3d10c508b1"} Jan 28 07:00:14 crc kubenswrapper[4776]: I0128 07:00:14.008821 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jrkz5" podStartSLOduration=2.083602817 podStartE2EDuration="6.00879863s" podCreationTimestamp="2026-01-28 07:00:08 +0000 UTC" firstStartedPulling="2026-01-28 07:00:09.667749991 +0000 UTC m=+581.083410191" lastFinishedPulling="2026-01-28 07:00:13.592945844 +0000 UTC m=+585.008606004" observedRunningTime="2026-01-28 07:00:14.00806713 +0000 UTC m=+585.423727300" watchObservedRunningTime="2026-01-28 07:00:14.00879863 +0000 UTC m=+585.424458780" Jan 28 07:00:14 crc kubenswrapper[4776]: I0128 07:00:14.174899 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.259337 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" podStartSLOduration=6.276611664 podStartE2EDuration="10.259314038s" podCreationTimestamp="2026-01-28 07:00:08 +0000 UTC" firstStartedPulling="2026-01-28 07:00:09.662475016 +0000 UTC m=+581.078135176" lastFinishedPulling="2026-01-28 07:00:13.64517738 +0000 UTC m=+585.060837550" observedRunningTime="2026-01-28 07:00:14.040719221 +0000 UTC m=+585.456379411" watchObservedRunningTime="2026-01-28 07:00:18.259314038 +0000 UTC m=+589.674974198" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.260535 
4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hf24q"] Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.260968 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-controller" containerID="cri-o://dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261007 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="sbdb" containerID="cri-o://f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261066 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261121 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="nbdb" containerID="cri-o://b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261163 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-acl-logging" containerID="cri-o://cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261196 
4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-node" containerID="cri-o://1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.261184 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="northd" containerID="cri-o://84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.297071 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovnkube-controller" containerID="cri-o://7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" gracePeriod=30 Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.545241 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hf24q_852d93f4-af9e-413f-8d64-c013edc14dc6/ovn-acl-logging/0.log" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.545770 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hf24q_852d93f4-af9e-413f-8d64-c013edc14dc6/ovn-controller/0.log" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.546296 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.600624 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wp6s9"] Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.600900 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-node" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.600919 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-node" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.600953 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="sbdb" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.600959 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="sbdb" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.600972 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovnkube-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.600979 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovnkube-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.600987 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.600993 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.601002 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" 
containerName="ovn-acl-logging" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601008 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-acl-logging" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.601014 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="nbdb" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601019 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="nbdb" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.601026 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kubecfg-setup" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601031 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kubecfg-setup" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.601039 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601045 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 07:00:18 crc kubenswrapper[4776]: E0128 07:00:18.601052 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="northd" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601058 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="northd" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601153 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="sbdb" Jan 28 07:00:18 
crc kubenswrapper[4776]: I0128 07:00:18.601167 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601175 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-ovn-metrics" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601181 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="northd" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601188 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovnkube-controller" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601193 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="ovn-acl-logging" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601201 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="kube-rbac-proxy-node" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.601209 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerName="nbdb" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.602932 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.690836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.690900 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691022 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691016 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691049 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691134 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4q8\" (UniqueName: \"kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691173 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691204 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691091 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691226 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691269 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691296 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691329 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket" (OuterVolumeSpecName: "log-socket") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691382 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691419 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691428 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash" (OuterVolumeSpecName: "host-slash") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log" (OuterVolumeSpecName: "node-log") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691501 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691533 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691633 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691668 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc 
kubenswrapper[4776]: I0128 07:00:18.691694 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides\") pod \"852d93f4-af9e-413f-8d64-c013edc14dc6\" (UID: \"852d93f4-af9e-413f-8d64-c013edc14dc6\") " Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.691981 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692109 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovn-node-metrics-cert\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692196 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-systemd-units\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692263 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-log-socket\") pod 
\"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-netd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-netns\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692347 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-systemd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692609 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-slash\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692149 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod 
"852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692162 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692674 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.692857 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693005 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-env-overrides\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-bin\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693317 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693491 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693573 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-config\") pod 
\"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693647 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-script-lib\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693702 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-var-lib-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-kubelet\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693809 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j2k\" (UniqueName: \"kubernetes.io/projected/d2f6c039-1340-4810-ac2d-58bb195c9dc2-kube-api-access-54j2k\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-etc-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-ovn\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.693925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-node-log\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694045 4776 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694077 4776 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694097 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694118 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694136 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694154 4776 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694172 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694192 4776 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694210 4776 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694229 4776 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694247 4776 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-log-socket\") on node \"crc\" DevicePath \"\"" Jan 28 
07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694302 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/852d93f4-af9e-413f-8d64-c013edc14dc6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694322 4776 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-host-slash\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694341 4776 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-node-log\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694358 4776 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694377 4776 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.694396 4776 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.697416 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8" (OuterVolumeSpecName: "kube-api-access-df4q8") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "kube-api-access-df4q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.697536 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.705395 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "852d93f4-af9e-413f-8d64-c013edc14dc6" (UID: "852d93f4-af9e-413f-8d64-c013edc14dc6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-kubelet\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796627 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j2k\" (UniqueName: \"kubernetes.io/projected/d2f6c039-1340-4810-ac2d-58bb195c9dc2-kube-api-access-54j2k\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-etc-openvswitch\") pod 
\"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796689 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-kubelet\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-ovn\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796771 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-node-log\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-etc-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovn-node-metrics-cert\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-systemd-units\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796925 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-log-socket\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-node-log\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-netd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 
07:00:18.796953 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796992 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-systemd-units\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796971 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-netns\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.796929 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-ovn\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797032 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-netd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797064 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-systemd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-slash\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-env-overrides\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-bin\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797527 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-config\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797602 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-script-lib\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-var-lib-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797721 4776 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/852d93f4-af9e-413f-8d64-c013edc14dc6-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797735 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df4q8\" (UniqueName: \"kubernetes.io/projected/852d93f4-af9e-413f-8d64-c013edc14dc6-kube-api-access-df4q8\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797753 4776 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/852d93f4-af9e-413f-8d64-c013edc14dc6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-var-lib-openvswitch\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797015 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-log-socket\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797835 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-slash\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797109 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-run-netns\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797911 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-systemd\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.797976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-host-cni-bin\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.798730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-env-overrides\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.798932 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-config\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.798966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2f6c039-1340-4810-ac2d-58bb195c9dc2-run-openvswitch\") pod 
\"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.800319 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovnkube-script-lib\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.803735 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2f6c039-1340-4810-ac2d-58bb195c9dc2-ovn-node-metrics-cert\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.819654 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j2k\" (UniqueName: \"kubernetes.io/projected/d2f6c039-1340-4810-ac2d-58bb195c9dc2-kube-api-access-54j2k\") pod \"ovnkube-node-wp6s9\" (UID: \"d2f6c039-1340-4810-ac2d-58bb195c9dc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:18 crc kubenswrapper[4776]: I0128 07:00:18.918388 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.035083 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hf24q_852d93f4-af9e-413f-8d64-c013edc14dc6/ovn-acl-logging/0.log" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.035868 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hf24q_852d93f4-af9e-413f-8d64-c013edc14dc6/ovn-controller/0.log" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036246 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036276 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036286 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036296 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036305 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036314 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" 
containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" exitCode=0 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036323 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" exitCode=143 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036331 4776 generic.go:334] "Generic (PLEG): container finished" podID="852d93f4-af9e-413f-8d64-c013edc14dc6" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" exitCode=143 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036380 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036432 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036444 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036484 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036499 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036509 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036534 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036575 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036586 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036595 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036603 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036610 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036616 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036623 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036630 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036653 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036661 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036668 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036675 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036683 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036692 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036699 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 
07:00:19.036707 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036715 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036726 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" event={"ID":"852d93f4-af9e-413f-8d64-c013edc14dc6","Type":"ContainerDied","Data":"afe2602e536a86aeee2007d0b0a5a8180fc1ebf1536c2ec351181e0b5588e77f"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036737 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036746 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036753 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036762 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036770 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036778 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036786 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036793 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036800 4776 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036711 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.036828 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hf24q" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.040400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"9271baac572bbe724847db1c681ff7a6d4dd80ae506b4d80b74129e90e2337cb"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.044731 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mng44_fe4cd320-31b6-43af-a080-c8b4855a1a79/kube-multus/0.log" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.044871 4776 generic.go:334] "Generic (PLEG): container finished" podID="fe4cd320-31b6-43af-a080-c8b4855a1a79" containerID="400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d" exitCode=2 Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.044954 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mng44" event={"ID":"fe4cd320-31b6-43af-a080-c8b4855a1a79","Type":"ContainerDied","Data":"400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d"} Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.046308 4776 scope.go:117] "RemoveContainer" containerID="400411410daec755af11d33576b15d49d09ee8dbd6eb4208de1b597f3cff221d" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.087639 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.101955 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hf24q"] Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.105004 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hf24q"] Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.111321 4776 scope.go:117] "RemoveContainer" 
containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.177583 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-5lrjh" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.216244 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.248040 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.264531 4776 scope.go:117] "RemoveContainer" containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.286628 4776 scope.go:117] "RemoveContainer" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.309956 4776 scope.go:117] "RemoveContainer" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.313176 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852d93f4-af9e-413f-8d64-c013edc14dc6" path="/var/lib/kubelet/pods/852d93f4-af9e-413f-8d64-c013edc14dc6/volumes" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.331482 4776 scope.go:117] "RemoveContainer" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.350408 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.350871 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.350913 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} err="failed to get container status \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": rpc error: code = NotFound desc = could not find container \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.350948 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.351263 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.351298 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} err="failed to get container status \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID 
starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.351324 4776 scope.go:117] "RemoveContainer" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.351761 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.351799 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} err="failed to get container status \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.351820 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.352116 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 
07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.352144 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} err="failed to get container status \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.352162 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.352565 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.352596 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} err="failed to get container status \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.352615 4776 scope.go:117] "RemoveContainer" 
containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.353141 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353169 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} err="failed to get container status \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353187 4776 scope.go:117] "RemoveContainer" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.353505 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": container with ID starting with cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746 not found: ID does not exist" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353532 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} err="failed to get container status \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": rpc error: code = NotFound desc = could not find container \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": container with ID starting with cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353589 4776 scope.go:117] "RemoveContainer" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.353920 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": container with ID starting with dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808 not found: ID does not exist" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353943 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} err="failed to get container status \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": rpc error: code = NotFound desc = could not find container \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": container with ID starting with dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.353979 4776 scope.go:117] "RemoveContainer" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: E0128 07:00:19.354266 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": container with ID starting with f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27 not found: ID does not exist" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354284 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} err="failed to get container status \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": rpc error: code = NotFound desc = could not find container \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": container with ID starting with f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354298 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354599 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} err="failed to get container status \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": rpc error: code = NotFound desc = could not find container \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354619 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354872 4776 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} err="failed to get container status \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.354894 4776 scope.go:117] "RemoveContainer" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355189 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} err="failed to get container status \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355212 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355520 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} err="failed to get container status \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 
84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355561 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355809 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} err="failed to get container status \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.355832 4776 scope.go:117] "RemoveContainer" containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356137 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} err="failed to get container status \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356165 4776 scope.go:117] "RemoveContainer" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356468 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} err="failed to get container status \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": rpc error: code = NotFound desc = could not find container \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": container with ID starting with cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356489 4776 scope.go:117] "RemoveContainer" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356844 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} err="failed to get container status \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": rpc error: code = NotFound desc = could not find container \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": container with ID starting with dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.356880 4776 scope.go:117] "RemoveContainer" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357211 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} err="failed to get container status \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": rpc error: code = NotFound desc = could not find container \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": container with ID starting with f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27 not found: ID does not 
exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357233 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357474 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} err="failed to get container status \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": rpc error: code = NotFound desc = could not find container \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357498 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357764 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} err="failed to get container status \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.357792 4776 scope.go:117] "RemoveContainer" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358081 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} err="failed to get container status 
\"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358108 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358386 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} err="failed to get container status \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358412 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358652 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} err="failed to get container status \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358676 4776 scope.go:117] "RemoveContainer" 
containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358946 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} err="failed to get container status \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.358971 4776 scope.go:117] "RemoveContainer" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359213 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} err="failed to get container status \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": rpc error: code = NotFound desc = could not find container \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": container with ID starting with cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359239 4776 scope.go:117] "RemoveContainer" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359462 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} err="failed to get container status \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": rpc error: code = NotFound desc = could 
not find container \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": container with ID starting with dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359486 4776 scope.go:117] "RemoveContainer" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359727 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} err="failed to get container status \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": rpc error: code = NotFound desc = could not find container \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": container with ID starting with f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.359751 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360010 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} err="failed to get container status \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": rpc error: code = NotFound desc = could not find container \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360033 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 
07:00:19.360300 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} err="failed to get container status \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360324 4776 scope.go:117] "RemoveContainer" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360572 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} err="failed to get container status \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360603 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360828 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} err="failed to get container status \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 
84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.360851 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361037 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} err="failed to get container status \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361057 4776 scope.go:117] "RemoveContainer" containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361339 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} err="failed to get container status \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361365 4776 scope.go:117] "RemoveContainer" containerID="cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361594 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746"} err="failed to get container status \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": rpc error: code = NotFound desc = could not find container \"cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746\": container with ID starting with cf50206937ff99b9a7afb0093fff037f73bf1579b37ad6ed256518eea7ee3746 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361614 4776 scope.go:117] "RemoveContainer" containerID="dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361823 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808"} err="failed to get container status \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": rpc error: code = NotFound desc = could not find container \"dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808\": container with ID starting with dc0b25e01a333a2808feaf27ccd449cbfac772b04710e88b8013e1c9e1bac808 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.361847 4776 scope.go:117] "RemoveContainer" containerID="f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362127 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27"} err="failed to get container status \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": rpc error: code = NotFound desc = could not find container \"f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27\": container with ID starting with f4f87dff616b6655163d09d282e0b0d1e2e8b8ef19b6bf05980050aa79c2af27 not found: ID does not 
exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362156 4776 scope.go:117] "RemoveContainer" containerID="7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362374 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05"} err="failed to get container status \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": rpc error: code = NotFound desc = could not find container \"7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05\": container with ID starting with 7885895698c8983c0e8bd6dc6f43b618029029f7a5ba181b255917456ff92e05 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362398 4776 scope.go:117] "RemoveContainer" containerID="f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362677 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede"} err="failed to get container status \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": rpc error: code = NotFound desc = could not find container \"f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede\": container with ID starting with f98866a8783ad2d556bd17c241f553afdaf1e5044210d930b7ae1cfe08029ede not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362700 4776 scope.go:117] "RemoveContainer" containerID="b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362880 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b"} err="failed to get container status 
\"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": rpc error: code = NotFound desc = could not find container \"b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b\": container with ID starting with b6599cf2f808a2c8fc42188a1ec13f11f88d5d75395979710a3a4e3662872c5b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.362915 4776 scope.go:117] "RemoveContainer" containerID="84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.363107 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b"} err="failed to get container status \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": rpc error: code = NotFound desc = could not find container \"84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b\": container with ID starting with 84be44fdb07722cff340e44c173f6c844a0e82c9cd466785e26de74e7f0f057b not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.363126 4776 scope.go:117] "RemoveContainer" containerID="b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.363318 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536"} err="failed to get container status \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": rpc error: code = NotFound desc = could not find container \"b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536\": container with ID starting with b724edf0268fd836c9d7a717a7fe38779d5a05a74e43e92a9fb0f966d8b9e536 not found: ID does not exist" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.363338 4776 scope.go:117] "RemoveContainer" 
containerID="1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07" Jan 28 07:00:19 crc kubenswrapper[4776]: I0128 07:00:19.363584 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07"} err="failed to get container status \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": rpc error: code = NotFound desc = could not find container \"1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07\": container with ID starting with 1254e5d6b49f1503b23bd8e1d40d9e9faaad37ec6b23cad301774e80a4fd4c07 not found: ID does not exist" Jan 28 07:00:20 crc kubenswrapper[4776]: I0128 07:00:20.052751 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mng44_fe4cd320-31b6-43af-a080-c8b4855a1a79/kube-multus/0.log" Jan 28 07:00:20 crc kubenswrapper[4776]: I0128 07:00:20.052894 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mng44" event={"ID":"fe4cd320-31b6-43af-a080-c8b4855a1a79","Type":"ContainerStarted","Data":"4168072bd6a6441fb20e254ff3cfb12fe18afec7621a51ad64156ac5d62efef5"} Jan 28 07:00:20 crc kubenswrapper[4776]: I0128 07:00:20.055451 4776 generic.go:334] "Generic (PLEG): container finished" podID="d2f6c039-1340-4810-ac2d-58bb195c9dc2" containerID="f2777f60afa1c88c0e13367690905132f44972610bd06adcbfc02aee6ea4ce40" exitCode=0 Jan 28 07:00:20 crc kubenswrapper[4776]: I0128 07:00:20.055523 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerDied","Data":"f2777f60afa1c88c0e13367690905132f44972610bd06adcbfc02aee6ea4ce40"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067019 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" 
event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"95d6f8766d24f213506d197ca5d6920642eb319357a11b1fcfc386550329dda7"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067235 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"a9d0d39fb6c95f85e2146e308ac717a351a9b44dd9f77325c656b0483d379108"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"6755b3eb2306d13515ba2254d3936310a259a3b93b16ed6f4465786ffc9be451"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067253 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"7207d3d420f8935afd65d499b064067bd7538b2a71cddf20b761de5f5ba28bcb"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067260 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"6487d275bbf4b290a1d36a5567732e1831f715219478d1edabd6cd56dbfd9e39"} Jan 28 07:00:21 crc kubenswrapper[4776]: I0128 07:00:21.067268 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"6551d8ac6b19e84539bd834593d2445a4e1a9e782ac1352413068064eacba837"} Jan 28 07:00:24 crc kubenswrapper[4776]: I0128 07:00:24.090365 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" 
event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"bbb3d8f074b8308692f3c716c804ed0ae67d09907089baeff23e15ddca6025bd"} Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.105640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" event={"ID":"d2f6c039-1340-4810-ac2d-58bb195c9dc2","Type":"ContainerStarted","Data":"9ac8b57450b4f2c7f5c1b2eb53e0b12f4e7026683e5d9694d14c387aa569447b"} Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.106436 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.106517 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.106603 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.146681 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" podStartSLOduration=8.146665811 podStartE2EDuration="8.146665811s" podCreationTimestamp="2026-01-28 07:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:00:26.141160003 +0000 UTC m=+597.556820163" watchObservedRunningTime="2026-01-28 07:00:26.146665811 +0000 UTC m=+597.562325971" Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.149039 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:26 crc kubenswrapper[4776]: I0128 07:00:26.163739 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:33 crc 
kubenswrapper[4776]: I0128 07:00:33.852932 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:00:33 crc kubenswrapper[4776]: I0128 07:00:33.853742 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:00:48 crc kubenswrapper[4776]: I0128 07:00:48.945043 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wp6s9" Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.114262 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"] Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.116411 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.119716 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.135005 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"]
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.273243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.273295 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.273381 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7rt\" (UniqueName: \"kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.375232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.375395 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.375610 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7rt\" (UniqueName: \"kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.376282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.376403 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.411767 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7rt\" (UniqueName: \"kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.440642 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:50 crc kubenswrapper[4776]: I0128 07:00:50.724014 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"]
Jan 28 07:00:50 crc kubenswrapper[4776]: W0128 07:00:50.737485 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d23dd47_b538_454a_873c_b3cc6b26c92b.slice/crio-aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309 WatchSource:0}: Error finding container aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309: Status 404 returned error can't find the container with id aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309
Jan 28 07:00:51 crc kubenswrapper[4776]: I0128 07:00:51.279834 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerID="60812ae9b5abc9e00f84d2cf4b4f721a63b6494a7ee12f20b8629f22c167ce0f" exitCode=0
Jan 28 07:00:51 crc kubenswrapper[4776]: I0128 07:00:51.279980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv" event={"ID":"4d23dd47-b538-454a-873c-b3cc6b26c92b","Type":"ContainerDied","Data":"60812ae9b5abc9e00f84d2cf4b4f721a63b6494a7ee12f20b8629f22c167ce0f"}
Jan 28 07:00:51 crc kubenswrapper[4776]: I0128 07:00:51.280463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv" event={"ID":"4d23dd47-b538-454a-873c-b3cc6b26c92b","Type":"ContainerStarted","Data":"aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309"}
Jan 28 07:00:53 crc kubenswrapper[4776]: I0128 07:00:53.315941 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerID="cb3d8e09887957d224263ccbdbab241555062d469675f0bfae6a5a0cd756eccc" exitCode=0
Jan 28 07:00:53 crc kubenswrapper[4776]: I0128 07:00:53.316148 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv" event={"ID":"4d23dd47-b538-454a-873c-b3cc6b26c92b","Type":"ContainerDied","Data":"cb3d8e09887957d224263ccbdbab241555062d469675f0bfae6a5a0cd756eccc"}
Jan 28 07:00:54 crc kubenswrapper[4776]: I0128 07:00:54.326401 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerID="276dc4ba951d85628ca5ec8594198bd6dc708316cd1f4cc7fa4cb60486811436" exitCode=0
Jan 28 07:00:54 crc kubenswrapper[4776]: I0128 07:00:54.326466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv" event={"ID":"4d23dd47-b538-454a-873c-b3cc6b26c92b","Type":"ContainerDied","Data":"276dc4ba951d85628ca5ec8594198bd6dc708316cd1f4cc7fa4cb60486811436"}
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.601954 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.755210 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp7rt\" (UniqueName: \"kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt\") pod \"4d23dd47-b538-454a-873c-b3cc6b26c92b\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") "
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.755376 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util\") pod \"4d23dd47-b538-454a-873c-b3cc6b26c92b\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") "
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.755509 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle\") pod \"4d23dd47-b538-454a-873c-b3cc6b26c92b\" (UID: \"4d23dd47-b538-454a-873c-b3cc6b26c92b\") "
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.758146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle" (OuterVolumeSpecName: "bundle") pod "4d23dd47-b538-454a-873c-b3cc6b26c92b" (UID: "4d23dd47-b538-454a-873c-b3cc6b26c92b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.763775 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt" (OuterVolumeSpecName: "kube-api-access-fp7rt") pod "4d23dd47-b538-454a-873c-b3cc6b26c92b" (UID: "4d23dd47-b538-454a-873c-b3cc6b26c92b"). InnerVolumeSpecName "kube-api-access-fp7rt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.769153 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util" (OuterVolumeSpecName: "util") pod "4d23dd47-b538-454a-873c-b3cc6b26c92b" (UID: "4d23dd47-b538-454a-873c-b3cc6b26c92b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.857131 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.857186 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp7rt\" (UniqueName: \"kubernetes.io/projected/4d23dd47-b538-454a-873c-b3cc6b26c92b-kube-api-access-fp7rt\") on node \"crc\" DevicePath \"\""
Jan 28 07:00:55 crc kubenswrapper[4776]: I0128 07:00:55.857207 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d23dd47-b538-454a-873c-b3cc6b26c92b-util\") on node \"crc\" DevicePath \"\""
Jan 28 07:00:56 crc kubenswrapper[4776]: I0128 07:00:56.342084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv" event={"ID":"4d23dd47-b538-454a-873c-b3cc6b26c92b","Type":"ContainerDied","Data":"aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309"}
Jan 28 07:00:56 crc kubenswrapper[4776]: I0128 07:00:56.342141 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa34d2e1a1e2bc2de32129e806da6425c0a329d80c2a36063dd81ddb47d6e309"
Jan 28 07:00:56 crc kubenswrapper[4776]: I0128 07:00:56.342193 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv"
Jan 28 07:01:03 crc kubenswrapper[4776]: I0128 07:01:03.852279 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 07:01:03 crc kubenswrapper[4776]: I0128 07:01:03.853532 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 07:01:03 crc kubenswrapper[4776]: I0128 07:01:03.853650 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56"
Jan 28 07:01:03 crc kubenswrapper[4776]: I0128 07:01:03.854952 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 28 07:01:03 crc kubenswrapper[4776]: I0128 07:01:03.855040 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0" gracePeriod=600
Jan 28 07:01:04 crc kubenswrapper[4776]: I0128 07:01:04.387257 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0" exitCode=0
Jan 28 07:01:04 crc kubenswrapper[4776]: I0128 07:01:04.387292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0"}
Jan 28 07:01:04 crc kubenswrapper[4776]: I0128 07:01:04.387784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d"}
Jan 28 07:01:04 crc kubenswrapper[4776]: I0128 07:01:04.387888 4776 scope.go:117] "RemoveContainer" containerID="f5691c3966fa8bfeca0b7c2a14453cfb77e0738ae741fdd437e2c637ddfe5d3c"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.939847 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"]
Jan 28 07:01:07 crc kubenswrapper[4776]: E0128 07:01:07.942707 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="util"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.942940 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="util"
Jan 28 07:01:07 crc kubenswrapper[4776]: E0128 07:01:07.943076 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="pull"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.943141 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="pull"
Jan 28 07:01:07 crc kubenswrapper[4776]: E0128 07:01:07.943217 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="extract"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.943284 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="extract"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.943506 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23dd47-b538-454a-873c-b3cc6b26c92b" containerName="extract"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.944271 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.947460 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.948275 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.948304 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-9sts9"
Jan 28 07:01:07 crc kubenswrapper[4776]: I0128 07:01:07.961883 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.005489 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.006298 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.008637 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.008911 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ggpjj"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.015508 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.024649 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.025660 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.046395 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.115437 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.115844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.115875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxf9\" (UniqueName: \"kubernetes.io/projected/633bf947-38aa-4444-911d-ea2f55433a93-kube-api-access-8cxf9\") pod \"obo-prometheus-operator-68bc856cb9-9v6mf\" (UID: \"633bf947-38aa-4444-911d-ea2f55433a93\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.115893 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.115920 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.197213 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-srqjl"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.198097 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.201314 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.201470 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-4cf62"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.217349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.217397 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.217434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxf9\" (UniqueName: \"kubernetes.io/projected/633bf947-38aa-4444-911d-ea2f55433a93-kube-api-access-8cxf9\") pod \"obo-prometheus-operator-68bc856cb9-9v6mf\" (UID: \"633bf947-38aa-4444-911d-ea2f55433a93\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.217456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.217488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.227530 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.227642 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.227571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f52dbbd1-d020-4074-93eb-706fff6e588b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp\" (UID: \"f52dbbd1-d020-4074-93eb-706fff6e588b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.227882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/194dde85-71e7-4d74-80c4-59e327ac851a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf\" (UID: \"194dde85-71e7-4d74-80c4-59e327ac851a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.234222 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-srqjl"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.256273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxf9\" (UniqueName: \"kubernetes.io/projected/633bf947-38aa-4444-911d-ea2f55433a93-kube-api-access-8cxf9\") pod \"obo-prometheus-operator-68bc856cb9-9v6mf\" (UID: \"633bf947-38aa-4444-911d-ea2f55433a93\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.265493 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.318511 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/215d3c95-e6d6-4022-a435-f6c30c630727-observability-operator-tls\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.318602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrbsh\" (UniqueName: \"kubernetes.io/projected/215d3c95-e6d6-4022-a435-f6c30c630727-kube-api-access-qrbsh\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.326651 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.339821 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.356029 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qd6sh"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.356680 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.359906 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-nmkkq"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.384628 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qd6sh"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.420250 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/215d3c95-e6d6-4022-a435-f6c30c630727-observability-operator-tls\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.420332 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrbsh\" (UniqueName: \"kubernetes.io/projected/215d3c95-e6d6-4022-a435-f6c30c630727-kube-api-access-qrbsh\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.428557 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/215d3c95-e6d6-4022-a435-f6c30c630727-observability-operator-tls\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.440227 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrbsh\" (UniqueName: \"kubernetes.io/projected/215d3c95-e6d6-4022-a435-f6c30c630727-kube-api-access-qrbsh\") pod \"observability-operator-59bdc8b94-srqjl\" (UID: \"215d3c95-e6d6-4022-a435-f6c30c630727\") " pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.512426 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-srqjl"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.530999 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1812216-a0b3-4ae2-9c2c-7086dc74163b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.531136 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmll8\" (UniqueName: \"kubernetes.io/projected/f1812216-a0b3-4ae2-9c2c-7086dc74163b-kube-api-access-dmll8\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: W0128 07:01:08.569388 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633bf947_38aa_4444_911d_ea2f55433a93.slice/crio-ee2ad80220e8ba2da3729ab5833035d1485d7e4d2b1e5c92408490f7936489df WatchSource:0}: Error finding container ee2ad80220e8ba2da3729ab5833035d1485d7e4d2b1e5c92408490f7936489df: Status 404 returned error can't find the container with id ee2ad80220e8ba2da3729ab5833035d1485d7e4d2b1e5c92408490f7936489df
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.571478 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.635270 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1812216-a0b3-4ae2-9c2c-7086dc74163b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.635381 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmll8\" (UniqueName: \"kubernetes.io/projected/f1812216-a0b3-4ae2-9c2c-7086dc74163b-kube-api-access-dmll8\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.637274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1812216-a0b3-4ae2-9c2c-7086dc74163b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.656584 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmll8\" (UniqueName: \"kubernetes.io/projected/f1812216-a0b3-4ae2-9c2c-7086dc74163b-kube-api-access-dmll8\") pod \"perses-operator-5bf474d74f-qd6sh\" (UID: \"f1812216-a0b3-4ae2-9c2c-7086dc74163b\") " pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.711928 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.754922 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-srqjl"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.847380 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp"]
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.851778 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf"]
Jan 28 07:01:08 crc kubenswrapper[4776]: W0128 07:01:08.868576 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52dbbd1_d020_4074_93eb_706fff6e588b.slice/crio-d624afad838e7a728aaae3ec44000d4f1ff510d23887c197f3366494d854a1ed WatchSource:0}: Error finding container d624afad838e7a728aaae3ec44000d4f1ff510d23887c197f3366494d854a1ed: Status 404 returned error can't find the container with id d624afad838e7a728aaae3ec44000d4f1ff510d23887c197f3366494d854a1ed
Jan 28 07:01:08 crc kubenswrapper[4776]: I0128 07:01:08.974804 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qd6sh"]
Jan 28 07:01:08 crc kubenswrapper[4776]: W0128 07:01:08.981377 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1812216_a0b3_4ae2_9c2c_7086dc74163b.slice/crio-c2440cef75acd5d00b542713447c65fb5e5ca884122e856c9e23445330974d56 WatchSource:0}: Error finding container c2440cef75acd5d00b542713447c65fb5e5ca884122e856c9e23445330974d56: Status 404 returned error can't find the container with id c2440cef75acd5d00b542713447c65fb5e5ca884122e856c9e23445330974d56
Jan 28 07:01:09 crc kubenswrapper[4776]: I0128 07:01:09.425008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-srqjl" event={"ID":"215d3c95-e6d6-4022-a435-f6c30c630727","Type":"ContainerStarted","Data":"b42754c9638d2257642a2742b2999a469b5156388900e07e48ebc241be3fd930"}
Jan 28 07:01:09 crc kubenswrapper[4776]: I0128 07:01:09.426211 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf" event={"ID":"633bf947-38aa-4444-911d-ea2f55433a93","Type":"ContainerStarted","Data":"ee2ad80220e8ba2da3729ab5833035d1485d7e4d2b1e5c92408490f7936489df"}
Jan 28 07:01:09 crc kubenswrapper[4776]: I0128 07:01:09.427059 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh" event={"ID":"f1812216-a0b3-4ae2-9c2c-7086dc74163b","Type":"ContainerStarted","Data":"c2440cef75acd5d00b542713447c65fb5e5ca884122e856c9e23445330974d56"}
Jan 28 07:01:09 crc kubenswrapper[4776]: I0128 07:01:09.427849 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf" event={"ID":"194dde85-71e7-4d74-80c4-59e327ac851a","Type":"ContainerStarted","Data":"eca080f7d61c16808347aec290fd2295abecd0d00eb5e3ec97bbde2caa81565c"}
Jan 28 07:01:09 crc kubenswrapper[4776]: I0128 07:01:09.428710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp" event={"ID":"f52dbbd1-d020-4074-93eb-706fff6e588b","Type":"ContainerStarted","Data":"d624afad838e7a728aaae3ec44000d4f1ff510d23887c197f3366494d854a1ed"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.488391 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh" event={"ID":"f1812216-a0b3-4ae2-9c2c-7086dc74163b","Type":"ContainerStarted","Data":"e793aceed553f3f85449d4e2781f1cf5b8d37b26cf6fdf8f7e9ad504d6952e4d"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.489374 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.491027 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf" event={"ID":"194dde85-71e7-4d74-80c4-59e327ac851a","Type":"ContainerStarted","Data":"c8a40ed00fc5329b33e880902e4605bf2ad8be3d1b9d3f0d86997cfe87a5a2f8"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.492521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp" event={"ID":"f52dbbd1-d020-4074-93eb-706fff6e588b","Type":"ContainerStarted","Data":"9fbc746b268b5c97897ed3f1f84c4d4c6f9c86de332c118302b16915a55a9024"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.493929 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-srqjl" event={"ID":"215d3c95-e6d6-4022-a435-f6c30c630727","Type":"ContainerStarted","Data":"b3a81de9860f329f9b9c0bcea6d2a9d47364b00e2dfc5f57fd6d03439ad6fc63"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.495708 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf" event={"ID":"633bf947-38aa-4444-911d-ea2f55433a93","Type":"ContainerStarted","Data":"0829a80ab397133b231ffdc0de22c3a0b24ff2a08a8a545103c971f9652acc96"}
Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.509285 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh"
podStartSLOduration=1.462604903 podStartE2EDuration="10.50925196s" podCreationTimestamp="2026-01-28 07:01:08 +0000 UTC" firstStartedPulling="2026-01-28 07:01:08.983598339 +0000 UTC m=+640.399258499" lastFinishedPulling="2026-01-28 07:01:18.030245396 +0000 UTC m=+649.445905556" observedRunningTime="2026-01-28 07:01:18.508950252 +0000 UTC m=+649.924610432" watchObservedRunningTime="2026-01-28 07:01:18.50925196 +0000 UTC m=+649.924912120" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.513163 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-srqjl" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.535247 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9v6mf" podStartSLOduration=2.109988355 podStartE2EDuration="11.535224363s" podCreationTimestamp="2026-01-28 07:01:07 +0000 UTC" firstStartedPulling="2026-01-28 07:01:08.574997173 +0000 UTC m=+639.990657333" lastFinishedPulling="2026-01-28 07:01:18.000233171 +0000 UTC m=+649.415893341" observedRunningTime="2026-01-28 07:01:18.531113511 +0000 UTC m=+649.946773691" watchObservedRunningTime="2026-01-28 07:01:18.535224363 +0000 UTC m=+649.950884523" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.551060 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp" podStartSLOduration=2.423092446 podStartE2EDuration="11.551021488s" podCreationTimestamp="2026-01-28 07:01:07 +0000 UTC" firstStartedPulling="2026-01-28 07:01:08.871024965 +0000 UTC m=+640.286685125" lastFinishedPulling="2026-01-28 07:01:17.998953997 +0000 UTC m=+649.414614167" observedRunningTime="2026-01-28 07:01:18.548752095 +0000 UTC m=+649.964412255" watchObservedRunningTime="2026-01-28 07:01:18.551021488 +0000 UTC m=+649.966681648" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 
07:01:18.580763 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf" podStartSLOduration=2.412650636 podStartE2EDuration="11.580741964s" podCreationTimestamp="2026-01-28 07:01:07 +0000 UTC" firstStartedPulling="2026-01-28 07:01:08.861769328 +0000 UTC m=+640.277429488" lastFinishedPulling="2026-01-28 07:01:18.029860656 +0000 UTC m=+649.445520816" observedRunningTime="2026-01-28 07:01:18.568072566 +0000 UTC m=+649.983732736" watchObservedRunningTime="2026-01-28 07:01:18.580741964 +0000 UTC m=+649.996402124" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.591759 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-srqjl" Jan 28 07:01:18 crc kubenswrapper[4776]: I0128 07:01:18.612220 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-srqjl" podStartSLOduration=1.348515653 podStartE2EDuration="10.612202999s" podCreationTimestamp="2026-01-28 07:01:08 +0000 UTC" firstStartedPulling="2026-01-28 07:01:08.771491576 +0000 UTC m=+640.187151736" lastFinishedPulling="2026-01-28 07:01:18.035178922 +0000 UTC m=+649.450839082" observedRunningTime="2026-01-28 07:01:18.60606553 +0000 UTC m=+650.021725690" watchObservedRunningTime="2026-01-28 07:01:18.612202999 +0000 UTC m=+650.027863159" Jan 28 07:01:25 crc kubenswrapper[4776]: I0128 07:01:25.386661 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 07:01:28 crc kubenswrapper[4776]: I0128 07:01:28.715216 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-qd6sh" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.581977 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj"] Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.584221 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.587468 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.592001 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj"] Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.712421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.712478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.712509 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd7d9\" (UniqueName: \"kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9\") pod 
\"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.813435 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.813499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.813528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd7d9\" (UniqueName: \"kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.814003 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.814043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.830868 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd7d9\" (UniqueName: \"kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:48 crc kubenswrapper[4776]: I0128 07:01:48.901997 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:49 crc kubenswrapper[4776]: I0128 07:01:49.189181 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj"] Jan 28 07:01:49 crc kubenswrapper[4776]: I0128 07:01:49.677855 4776 generic.go:334] "Generic (PLEG): container finished" podID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerID="f553a0ad4fe465c68fccae167a8e40722cdb1eaea574b72687058799b2df43cd" exitCode=0 Jan 28 07:01:49 crc kubenswrapper[4776]: I0128 07:01:49.677921 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" event={"ID":"8d1bba84-5283-4516-94aa-2b7fa90c5e6d","Type":"ContainerDied","Data":"f553a0ad4fe465c68fccae167a8e40722cdb1eaea574b72687058799b2df43cd"} Jan 28 07:01:49 crc kubenswrapper[4776]: I0128 07:01:49.678243 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" event={"ID":"8d1bba84-5283-4516-94aa-2b7fa90c5e6d","Type":"ContainerStarted","Data":"f0d0c9737556523845ed7f3c49c099c21fbfdacc41ea44d4c838ebd79f9eecb9"} Jan 28 07:01:50 crc kubenswrapper[4776]: I0128 07:01:50.924930 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"] Jan 28 07:01:50 crc kubenswrapper[4776]: I0128 07:01:50.926648 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:50 crc kubenswrapper[4776]: I0128 07:01:50.940445 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"] Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.039482 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.039593 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.039946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhdv\" (UniqueName: \"kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.141791 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhdv\" (UniqueName: \"kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.142189 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.142209 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.142777 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.143115 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.162538 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhdv\" (UniqueName: \"kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv\") pod \"redhat-operators-6zfw9\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") " pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.278292 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6zfw9" Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.471959 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"] Jan 28 07:01:51 crc kubenswrapper[4776]: W0128 07:01:51.472264 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f9f56f_3ae1_4281_b6ad_68f0e862165e.slice/crio-613e732be4ab2cdd243c97031deabd7f436ca659f3e820ef07cb7d83555ff04f WatchSource:0}: Error finding container 613e732be4ab2cdd243c97031deabd7f436ca659f3e820ef07cb7d83555ff04f: Status 404 returned error can't find the container with id 613e732be4ab2cdd243c97031deabd7f436ca659f3e820ef07cb7d83555ff04f Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.690916 4776 generic.go:334] "Generic (PLEG): container finished" podID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerID="3429df36575e1ad4f781869fcbc9a9fd76a3fe0ee0fc1570ac37c3d5a0376d49" exitCode=0 Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.691014 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" event={"ID":"8d1bba84-5283-4516-94aa-2b7fa90c5e6d","Type":"ContainerDied","Data":"3429df36575e1ad4f781869fcbc9a9fd76a3fe0ee0fc1570ac37c3d5a0376d49"} Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.692573 4776 generic.go:334] "Generic (PLEG): container finished" podID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerID="e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b" exitCode=0 Jan 28 07:01:51 crc kubenswrapper[4776]: I0128 07:01:51.692614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerDied","Data":"e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b"} Jan 28 07:01:51 crc 
kubenswrapper[4776]: I0128 07:01:51.692641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerStarted","Data":"613e732be4ab2cdd243c97031deabd7f436ca659f3e820ef07cb7d83555ff04f"} Jan 28 07:01:52 crc kubenswrapper[4776]: I0128 07:01:52.704340 4776 generic.go:334] "Generic (PLEG): container finished" podID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerID="573135bb90aa23fae39df4b1ccb040a5a0cadc4cf5b6cbd449a59f60f596337e" exitCode=0 Jan 28 07:01:52 crc kubenswrapper[4776]: I0128 07:01:52.704766 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" event={"ID":"8d1bba84-5283-4516-94aa-2b7fa90c5e6d","Type":"ContainerDied","Data":"573135bb90aa23fae39df4b1ccb040a5a0cadc4cf5b6cbd449a59f60f596337e"} Jan 28 07:01:53 crc kubenswrapper[4776]: I0128 07:01:53.711139 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerStarted","Data":"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af"} Jan 28 07:01:53 crc kubenswrapper[4776]: I0128 07:01:53.977678 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.085921 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd7d9\" (UniqueName: \"kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9\") pod \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.086029 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util\") pod \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.086123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle\") pod \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\" (UID: \"8d1bba84-5283-4516-94aa-2b7fa90c5e6d\") " Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.086632 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle" (OuterVolumeSpecName: "bundle") pod "8d1bba84-5283-4516-94aa-2b7fa90c5e6d" (UID: "8d1bba84-5283-4516-94aa-2b7fa90c5e6d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.095741 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9" (OuterVolumeSpecName: "kube-api-access-cd7d9") pod "8d1bba84-5283-4516-94aa-2b7fa90c5e6d" (UID: "8d1bba84-5283-4516-94aa-2b7fa90c5e6d"). InnerVolumeSpecName "kube-api-access-cd7d9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.109900 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util" (OuterVolumeSpecName: "util") pod "8d1bba84-5283-4516-94aa-2b7fa90c5e6d" (UID: "8d1bba84-5283-4516-94aa-2b7fa90c5e6d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.188087 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd7d9\" (UniqueName: \"kubernetes.io/projected/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-kube-api-access-cd7d9\") on node \"crc\" DevicePath \"\"" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.188119 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-util\") on node \"crc\" DevicePath \"\"" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.188128 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d1bba84-5283-4516-94aa-2b7fa90c5e6d-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.720645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" event={"ID":"8d1bba84-5283-4516-94aa-2b7fa90c5e6d","Type":"ContainerDied","Data":"f0d0c9737556523845ed7f3c49c099c21fbfdacc41ea44d4c838ebd79f9eecb9"} Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.720919 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0d0c9737556523845ed7f3c49c099c21fbfdacc41ea44d4c838ebd79f9eecb9" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.721018 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj" Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.725671 4776 generic.go:334] "Generic (PLEG): container finished" podID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerID="83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af" exitCode=0 Jan 28 07:01:54 crc kubenswrapper[4776]: I0128 07:01:54.725933 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerDied","Data":"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af"} Jan 28 07:01:55 crc kubenswrapper[4776]: I0128 07:01:55.734435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerStarted","Data":"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e"} Jan 28 07:01:55 crc kubenswrapper[4776]: I0128 07:01:55.752949 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6zfw9" podStartSLOduration=2.119429116 podStartE2EDuration="5.752927614s" podCreationTimestamp="2026-01-28 07:01:50 +0000 UTC" firstStartedPulling="2026-01-28 07:01:51.69379607 +0000 UTC m=+683.109456230" lastFinishedPulling="2026-01-28 07:01:55.327294558 +0000 UTC m=+686.742954728" observedRunningTime="2026-01-28 07:01:55.750278082 +0000 UTC m=+687.165938242" watchObservedRunningTime="2026-01-28 07:01:55.752927614 +0000 UTC m=+687.168587774" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.548120 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5x9ww"] Jan 28 07:01:57 crc kubenswrapper[4776]: E0128 07:01:57.548362 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" 
containerName="util" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.548375 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="util" Jan 28 07:01:57 crc kubenswrapper[4776]: E0128 07:01:57.548384 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="pull" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.548390 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="pull" Jan 28 07:01:57 crc kubenswrapper[4776]: E0128 07:01:57.548397 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="extract" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.548403 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="extract" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.548519 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1bba84-5283-4516-94aa-2b7fa90c5e6d" containerName="extract" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.549005 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.551028 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.551066 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7542b" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.551525 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.562504 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5x9ww"] Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.631881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnqjn\" (UniqueName: \"kubernetes.io/projected/180b60f1-288a-4292-9aab-4322b1d1bce2-kube-api-access-mnqjn\") pod \"nmstate-operator-646758c888-5x9ww\" (UID: \"180b60f1-288a-4292-9aab-4322b1d1bce2\") " pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.733560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnqjn\" (UniqueName: \"kubernetes.io/projected/180b60f1-288a-4292-9aab-4322b1d1bce2-kube-api-access-mnqjn\") pod \"nmstate-operator-646758c888-5x9ww\" (UID: \"180b60f1-288a-4292-9aab-4322b1d1bce2\") " pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.760785 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnqjn\" (UniqueName: \"kubernetes.io/projected/180b60f1-288a-4292-9aab-4322b1d1bce2-kube-api-access-mnqjn\") pod \"nmstate-operator-646758c888-5x9ww\" (UID: 
\"180b60f1-288a-4292-9aab-4322b1d1bce2\") " pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww"
Jan 28 07:01:57 crc kubenswrapper[4776]: I0128 07:01:57.865230 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww"
Jan 28 07:01:58 crc kubenswrapper[4776]: I0128 07:01:58.049309 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5x9ww"]
Jan 28 07:01:58 crc kubenswrapper[4776]: I0128 07:01:58.755067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" event={"ID":"180b60f1-288a-4292-9aab-4322b1d1bce2","Type":"ContainerStarted","Data":"c8bb4c9059516cd2bb1f52d64be5eba0fdaa06c5d89429209c798327088ae7fa"}
Jan 28 07:02:01 crc kubenswrapper[4776]: I0128 07:02:01.296717 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:01 crc kubenswrapper[4776]: I0128 07:02:01.297110 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:01 crc kubenswrapper[4776]: I0128 07:02:01.368731 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:01 crc kubenswrapper[4776]: I0128 07:02:01.812655 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:02 crc kubenswrapper[4776]: I0128 07:02:02.779696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" event={"ID":"180b60f1-288a-4292-9aab-4322b1d1bce2","Type":"ContainerStarted","Data":"2c900502599b77908a72c875806c2522761011d4195cd8bf3b7b94ff5e1aab39"}
Jan 28 07:02:02 crc kubenswrapper[4776]: I0128 07:02:02.806762 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-5x9ww" podStartSLOduration=2.066208373 podStartE2EDuration="5.806730152s" podCreationTimestamp="2026-01-28 07:01:57 +0000 UTC" firstStartedPulling="2026-01-28 07:01:58.060435724 +0000 UTC m=+689.476095884" lastFinishedPulling="2026-01-28 07:02:01.800957503 +0000 UTC m=+693.216617663" observedRunningTime="2026-01-28 07:02:02.795672519 +0000 UTC m=+694.211332679" watchObservedRunningTime="2026-01-28 07:02:02.806730152 +0000 UTC m=+694.222390312"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.744666 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"]
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.785640 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6zfw9" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="registry-server" containerID="cri-o://faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e" gracePeriod=2
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.894924 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qgksg"]
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.896037 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.902961 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5w5cb"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.913906 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qgksg"]
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.919790 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"]
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.920687 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.922648 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.931945 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fn6dp"]
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.932999 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:03 crc kubenswrapper[4776]: I0128 07:02:03.965805 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.038951 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-dbus-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039008 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2t\" (UniqueName: \"kubernetes.io/projected/087b920d-366b-475c-85f2-e5512596d3f8-kube-api-access-4js2t\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcmpq\" (UniqueName: \"kubernetes.io/projected/ff12b52a-7e92-45bd-afd9-e0b577a8607d-kube-api-access-bcmpq\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039104 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdl8x\" (UniqueName: \"kubernetes.io/projected/2f1d6d84-d95e-4423-a7c1-7fa987beff1c-kube-api-access-tdl8x\") pod \"nmstate-metrics-54757c584b-qgksg\" (UID: \"2f1d6d84-d95e-4423-a7c1-7fa987beff1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff12b52a-7e92-45bd-afd9-e0b577a8607d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039192 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-ovs-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.039235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-nmstate-lock\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.053585 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.054834 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.057052 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9q9gl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.057966 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.065041 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.066292 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.140516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-dbus-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.140757 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2t\" (UniqueName: \"kubernetes.io/projected/087b920d-366b-475c-85f2-e5512596d3f8-kube-api-access-4js2t\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.140829 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcmpq\" (UniqueName: \"kubernetes.io/projected/ff12b52a-7e92-45bd-afd9-e0b577a8607d-kube-api-access-bcmpq\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.140913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdl8x\" (UniqueName: \"kubernetes.io/projected/2f1d6d84-d95e-4423-a7c1-7fa987beff1c-kube-api-access-tdl8x\") pod \"nmstate-metrics-54757c584b-qgksg\" (UID: \"2f1d6d84-d95e-4423-a7c1-7fa987beff1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141004 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff12b52a-7e92-45bd-afd9-e0b577a8607d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141086 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-ovs-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141133 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-dbus-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-nmstate-lock\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141268 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-ovs-socket\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.141317 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/087b920d-366b-475c-85f2-e5512596d3f8-nmstate-lock\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.151384 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff12b52a-7e92-45bd-afd9-e0b577a8607d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.158156 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2t\" (UniqueName: \"kubernetes.io/projected/087b920d-366b-475c-85f2-e5512596d3f8-kube-api-access-4js2t\") pod \"nmstate-handler-fn6dp\" (UID: \"087b920d-366b-475c-85f2-e5512596d3f8\") " pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.160106 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcmpq\" (UniqueName: \"kubernetes.io/projected/ff12b52a-7e92-45bd-afd9-e0b577a8607d-kube-api-access-bcmpq\") pod \"nmstate-webhook-8474b5b9d8-mjfqk\" (UID: \"ff12b52a-7e92-45bd-afd9-e0b577a8607d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.161107 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdl8x\" (UniqueName: \"kubernetes.io/projected/2f1d6d84-d95e-4423-a7c1-7fa987beff1c-kube-api-access-tdl8x\") pod \"nmstate-metrics-54757c584b-qgksg\" (UID: \"2f1d6d84-d95e-4423-a7c1-7fa987beff1c\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.234832 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7468d54cd-lsxp4"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.235754 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.236325 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.242524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59581e1b-5fa1-4649-b461-20815879a250-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.242604 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59581e1b-5fa1-4649-b461-20815879a250-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.242822 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlm7\" (UniqueName: \"kubernetes.io/projected/59581e1b-5fa1-4649-b461-20815879a250-kube-api-access-mmlm7\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.248420 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7468d54cd-lsxp4"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.256614 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.271054 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fn6dp"
Jan 28 07:02:04 crc kubenswrapper[4776]: W0128 07:02:04.307736 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087b920d_366b_475c_85f2_e5512596d3f8.slice/crio-3f774cecb6158c0aeeee076714ef407746fa04e30e2f8b46c8f37cc6a77bc30d WatchSource:0}: Error finding container 3f774cecb6158c0aeeee076714ef407746fa04e30e2f8b46c8f37cc6a77bc30d: Status 404 returned error can't find the container with id 3f774cecb6158c0aeeee076714ef407746fa04e30e2f8b46c8f37cc6a77bc30d
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348123 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59581e1b-5fa1-4649-b461-20815879a250-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348180 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-service-ca\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348200 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-oauth-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348217 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8vp\" (UniqueName: \"kubernetes.io/projected/e5a0d0f0-208f-4bf7-963a-d5e36660a786-kube-api-access-4r8vp\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348253 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlm7\" (UniqueName: \"kubernetes.io/projected/59581e1b-5fa1-4649-b461-20815879a250-kube-api-access-mmlm7\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348281 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348299 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-trusted-ca-bundle\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59581e1b-5fa1-4649-b461-20815879a250-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.348355 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-oauth-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.350261 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59581e1b-5fa1-4649-b461-20815879a250-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.355269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59581e1b-5fa1-4649-b461-20815879a250-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.386959 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlm7\" (UniqueName: \"kubernetes.io/projected/59581e1b-5fa1-4649-b461-20815879a250-kube-api-access-mmlm7\") pod \"nmstate-console-plugin-7754f76f8b-49nnl\" (UID: \"59581e1b-5fa1-4649-b461-20815879a250\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449765 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-trusted-ca-bundle\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-oauth-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449860 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-service-ca\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-oauth-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.449898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8vp\" (UniqueName: \"kubernetes.io/projected/e5a0d0f0-208f-4bf7-963a-d5e36660a786-kube-api-access-4r8vp\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.450862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.451144 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-oauth-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.451228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-service-ca\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.452397 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a0d0f0-208f-4bf7-963a-d5e36660a786-trusted-ca-bundle\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.456982 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-oauth-config\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.457453 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a0d0f0-208f-4bf7-963a-d5e36660a786-console-serving-cert\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.480095 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8vp\" (UniqueName: \"kubernetes.io/projected/e5a0d0f0-208f-4bf7-963a-d5e36660a786-kube-api-access-4r8vp\") pod \"console-7468d54cd-lsxp4\" (UID: \"e5a0d0f0-208f-4bf7-963a-d5e36660a786\") " pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.593313 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.604163 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7468d54cd-lsxp4"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.675048 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.700262 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qgksg"]
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.793527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk" event={"ID":"ff12b52a-7e92-45bd-afd9-e0b577a8607d","Type":"ContainerStarted","Data":"d600ba54153573cd28b35ad47f066b319a67205446c8f97ae007cd35e8068378"}
Jan 28 07:02:04 crc kubenswrapper[4776]: I0128 07:02:04.794587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fn6dp" event={"ID":"087b920d-366b-475c-85f2-e5512596d3f8","Type":"ContainerStarted","Data":"3f774cecb6158c0aeeee076714ef407746fa04e30e2f8b46c8f37cc6a77bc30d"}
Jan 28 07:02:04 crc kubenswrapper[4776]: W0128 07:02:04.897428 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1d6d84_d95e_4423_a7c1_7fa987beff1c.slice/crio-b0d6e7d595b8fefbc9abddc0b5d2105ac859ef8e74a650943d0af766e9484174 WatchSource:0}: Error finding container b0d6e7d595b8fefbc9abddc0b5d2105ac859ef8e74a650943d0af766e9484174: Status 404 returned error can't find the container with id b0d6e7d595b8fefbc9abddc0b5d2105ac859ef8e74a650943d0af766e9484174
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.136930 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl"]
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.181130 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7468d54cd-lsxp4"]
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.353380 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.378766 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content\") pod \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") "
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.378823 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhdv\" (UniqueName: \"kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv\") pod \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") "
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.378860 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities\") pod \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\" (UID: \"42f9f56f-3ae1-4281-b6ad-68f0e862165e\") "
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.379925 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities" (OuterVolumeSpecName: "utilities") pod "42f9f56f-3ae1-4281-b6ad-68f0e862165e" (UID: "42f9f56f-3ae1-4281-b6ad-68f0e862165e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.394517 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv" (OuterVolumeSpecName: "kube-api-access-6qhdv") pod "42f9f56f-3ae1-4281-b6ad-68f0e862165e" (UID: "42f9f56f-3ae1-4281-b6ad-68f0e862165e"). InnerVolumeSpecName "kube-api-access-6qhdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.480190 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qhdv\" (UniqueName: \"kubernetes.io/projected/42f9f56f-3ae1-4281-b6ad-68f0e862165e-kube-api-access-6qhdv\") on node \"crc\" DevicePath \"\""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.480220 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.554724 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42f9f56f-3ae1-4281-b6ad-68f0e862165e" (UID: "42f9f56f-3ae1-4281-b6ad-68f0e862165e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.580813 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f9f56f-3ae1-4281-b6ad-68f0e862165e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.803339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7468d54cd-lsxp4" event={"ID":"e5a0d0f0-208f-4bf7-963a-d5e36660a786","Type":"ContainerStarted","Data":"226c4a678957a2977a7a83e6027d956cd77d1f2c756e4418761bbdcbff3717a5"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.804037 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7468d54cd-lsxp4" event={"ID":"e5a0d0f0-208f-4bf7-963a-d5e36660a786","Type":"ContainerStarted","Data":"0bfcc2091069a8f3b1b29387ed39c765b3e31796745445a2cd26eac03dbad580"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.805560 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg" event={"ID":"2f1d6d84-d95e-4423-a7c1-7fa987beff1c","Type":"ContainerStarted","Data":"b0d6e7d595b8fefbc9abddc0b5d2105ac859ef8e74a650943d0af766e9484174"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.808620 4776 generic.go:334] "Generic (PLEG): container finished" podID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerID="faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e" exitCode=0
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.808690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerDied","Data":"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.808722 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zfw9" event={"ID":"42f9f56f-3ae1-4281-b6ad-68f0e862165e","Type":"ContainerDied","Data":"613e732be4ab2cdd243c97031deabd7f436ca659f3e820ef07cb7d83555ff04f"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.808739 4776 scope.go:117] "RemoveContainer" containerID="faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e"
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.808878 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zfw9"
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.817350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl" event={"ID":"59581e1b-5fa1-4649-b461-20815879a250","Type":"ContainerStarted","Data":"c7319407d509ad3d3dd196bdd5fdf2e8913bb70f78f9bb60068c0304f72c8155"}
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.830603 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7468d54cd-lsxp4" podStartSLOduration=1.830528076 podStartE2EDuration="1.830528076s" podCreationTimestamp="2026-01-28 07:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:02:05.824715647 +0000 UTC m=+697.240375827" watchObservedRunningTime="2026-01-28 07:02:05.830528076 +0000 UTC m=+697.246188246"
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.846746 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"]
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.856599 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6zfw9"]
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.860917 4776 scope.go:117] "RemoveContainer" containerID="83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af"
Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.879245 4776 scope.go:117] "RemoveContainer" containerID="e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.899331 4776 scope.go:117] "RemoveContainer" containerID="faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e" Jan 28 07:02:05 crc kubenswrapper[4776]: E0128 07:02:05.900749 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e\": container with ID starting with faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e not found: ID does not exist" containerID="faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.900790 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e"} err="failed to get container status \"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e\": rpc error: code = NotFound desc = could not find container \"faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e\": container with ID starting with faf7c3a8af97638440acd79834db67218f93ac4edc2f9e441ae219c220dd067e not found: ID does not exist" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.900817 4776 scope.go:117] "RemoveContainer" containerID="83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af" Jan 28 07:02:05 crc kubenswrapper[4776]: E0128 07:02:05.901517 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af\": container with ID starting with 83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af not found: ID does not exist" 
containerID="83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.901589 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af"} err="failed to get container status \"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af\": rpc error: code = NotFound desc = could not find container \"83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af\": container with ID starting with 83043eae23bac324ec85a2fc54453d7ac3b5c5d8c3010de391ba5e7353e649af not found: ID does not exist" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.901632 4776 scope.go:117] "RemoveContainer" containerID="e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b" Jan 28 07:02:05 crc kubenswrapper[4776]: E0128 07:02:05.902296 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b\": container with ID starting with e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b not found: ID does not exist" containerID="e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b" Jan 28 07:02:05 crc kubenswrapper[4776]: I0128 07:02:05.902334 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b"} err="failed to get container status \"e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b\": rpc error: code = NotFound desc = could not find container \"e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b\": container with ID starting with e75a8127747d0c14afd0f2559a16f011fc36fd29539ee89994f7c340f5dfed1b not found: ID does not exist" Jan 28 07:02:07 crc kubenswrapper[4776]: I0128 07:02:07.316435 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" path="/var/lib/kubelet/pods/42f9f56f-3ae1-4281-b6ad-68f0e862165e/volumes" Jan 28 07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.844510 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk" event={"ID":"ff12b52a-7e92-45bd-afd9-e0b577a8607d","Type":"ContainerStarted","Data":"8101a0044d0b7cdcd093e770af01b4907c941fa5505124fbe689d41ba26abb93"} Jan 28 07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.844730 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk" Jan 28 07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.849256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl" event={"ID":"59581e1b-5fa1-4649-b461-20815879a250","Type":"ContainerStarted","Data":"b0e057f4c3139b066cd11e3991eb68b62f72b14c2f89d55ffb6102f84bafc3a1"} Jan 28 07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.852281 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg" event={"ID":"2f1d6d84-d95e-4423-a7c1-7fa987beff1c","Type":"ContainerStarted","Data":"2f1c1b20c9eae6b6f59a651baffef1277c40ea1f0ec1df89ce7247af192609e0"} Jan 28 07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.876264 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk" podStartSLOduration=2.708486302 podStartE2EDuration="5.876243352s" podCreationTimestamp="2026-01-28 07:02:03 +0000 UTC" firstStartedPulling="2026-01-28 07:02:04.600809824 +0000 UTC m=+696.016469984" lastFinishedPulling="2026-01-28 07:02:07.768566874 +0000 UTC m=+699.184227034" observedRunningTime="2026-01-28 07:02:08.871526912 +0000 UTC m=+700.287187102" watchObservedRunningTime="2026-01-28 07:02:08.876243352 +0000 UTC m=+700.291903532" Jan 28 
07:02:08 crc kubenswrapper[4776]: I0128 07:02:08.905521 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-49nnl" podStartSLOduration=2.303341931 podStartE2EDuration="4.905502257s" podCreationTimestamp="2026-01-28 07:02:04 +0000 UTC" firstStartedPulling="2026-01-28 07:02:05.16578721 +0000 UTC m=+696.581447370" lastFinishedPulling="2026-01-28 07:02:07.767947536 +0000 UTC m=+699.183607696" observedRunningTime="2026-01-28 07:02:08.889296661 +0000 UTC m=+700.304956841" watchObservedRunningTime="2026-01-28 07:02:08.905502257 +0000 UTC m=+700.321162407" Jan 28 07:02:10 crc kubenswrapper[4776]: I0128 07:02:10.872668 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg" event={"ID":"2f1d6d84-d95e-4423-a7c1-7fa987beff1c","Type":"ContainerStarted","Data":"a1619d28f86e10ba8726c4ce6d2a7516fccaf06b2eb2e8db74bcfe1af8cfc5a3"} Jan 28 07:02:10 crc kubenswrapper[4776]: I0128 07:02:10.907533 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-qgksg" podStartSLOduration=3.051588391 podStartE2EDuration="7.907505391s" podCreationTimestamp="2026-01-28 07:02:03 +0000 UTC" firstStartedPulling="2026-01-28 07:02:04.903730388 +0000 UTC m=+696.319390548" lastFinishedPulling="2026-01-28 07:02:09.759647388 +0000 UTC m=+701.175307548" observedRunningTime="2026-01-28 07:02:10.894097492 +0000 UTC m=+702.309757692" watchObservedRunningTime="2026-01-28 07:02:10.907505391 +0000 UTC m=+702.323165591" Jan 28 07:02:14 crc kubenswrapper[4776]: I0128 07:02:14.605170 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7468d54cd-lsxp4" Jan 28 07:02:14 crc kubenswrapper[4776]: I0128 07:02:14.607915 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7468d54cd-lsxp4" Jan 28 07:02:14 crc 
kubenswrapper[4776]: I0128 07:02:14.613305 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7468d54cd-lsxp4" Jan 28 07:02:14 crc kubenswrapper[4776]: I0128 07:02:14.909361 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7468d54cd-lsxp4" Jan 28 07:02:14 crc kubenswrapper[4776]: I0128 07:02:14.981929 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 07:02:17 crc kubenswrapper[4776]: I0128 07:02:17.928712 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fn6dp" event={"ID":"087b920d-366b-475c-85f2-e5512596d3f8","Type":"ContainerStarted","Data":"c75d8e7ff8248f7795c20e4df05ff052fe9ff698c392dffe484f59e65738d0ff"} Jan 28 07:02:17 crc kubenswrapper[4776]: I0128 07:02:17.929289 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fn6dp" Jan 28 07:02:17 crc kubenswrapper[4776]: I0128 07:02:17.962024 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fn6dp" podStartSLOduration=2.614329455 podStartE2EDuration="14.962000539s" podCreationTimestamp="2026-01-28 07:02:03 +0000 UTC" firstStartedPulling="2026-01-28 07:02:04.311417341 +0000 UTC m=+695.727077501" lastFinishedPulling="2026-01-28 07:02:16.659088395 +0000 UTC m=+708.074748585" observedRunningTime="2026-01-28 07:02:17.950941225 +0000 UTC m=+709.366601385" watchObservedRunningTime="2026-01-28 07:02:17.962000539 +0000 UTC m=+709.377660729" Jan 28 07:02:24 crc kubenswrapper[4776]: I0128 07:02:24.262930 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-mjfqk" Jan 28 07:02:24 crc kubenswrapper[4776]: I0128 07:02:24.318498 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-fn6dp" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.864475 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9"] Jan 28 07:02:38 crc kubenswrapper[4776]: E0128 07:02:38.865764 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="registry-server" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.865782 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="registry-server" Jan 28 07:02:38 crc kubenswrapper[4776]: E0128 07:02:38.865796 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="extract-utilities" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.865803 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="extract-utilities" Jan 28 07:02:38 crc kubenswrapper[4776]: E0128 07:02:38.865819 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="extract-content" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.865826 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="extract-content" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.865968 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="42f9f56f-3ae1-4281-b6ad-68f0e862165e" containerName="registry-server" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.867023 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.870904 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.875720 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9"] Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.965420 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.965479 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzqz\" (UniqueName: \"kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:38 crc kubenswrapper[4776]: I0128 07:02:38.965775 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: 
I0128 07:02:39.067480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.067658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.067708 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzqz\" (UniqueName: \"kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.068391 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.068433 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.104611 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzqz\" (UniqueName: \"kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.190644 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:39 crc kubenswrapper[4776]: I0128 07:02:39.493338 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9"] Jan 28 07:02:39 crc kubenswrapper[4776]: W0128 07:02:39.495282 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod212e186f_3642_483d_adf6_00dfaf77ca5f.slice/crio-26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e WatchSource:0}: Error finding container 26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e: Status 404 returned error can't find the container with id 26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.036629 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-p5b6p" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerName="console" 
containerID="cri-o://ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7" gracePeriod=15 Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.098463 4776 generic.go:334] "Generic (PLEG): container finished" podID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerID="fad6f9f8f5dde555c1bf1365b94fd8f563df9d8c2945f1be5800a0e48cb7690e" exitCode=0 Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.098522 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" event={"ID":"212e186f-3642-483d-adf6-00dfaf77ca5f","Type":"ContainerDied","Data":"fad6f9f8f5dde555c1bf1365b94fd8f563df9d8c2945f1be5800a0e48cb7690e"} Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.098589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" event={"ID":"212e186f-3642-483d-adf6-00dfaf77ca5f","Type":"ContainerStarted","Data":"26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e"} Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.485695 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p5b6p_a6e763c5-5d99-4374-9ade-5ac3ff4b9817/console/0.log" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.485936 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586105 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586356 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586436 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586506 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586612 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586755 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lm97q\" (UniqueName: \"kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586829 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert\") pod \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\" (UID: \"a6e763c5-5d99-4374-9ade-5ac3ff4b9817\") " Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.586989 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587034 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config" (OuterVolumeSpecName: "console-config") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587059 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587123 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca" (OuterVolumeSpecName: "service-ca") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587352 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587377 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587390 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.587402 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.592567 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.592578 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q" (OuterVolumeSpecName: "kube-api-access-lm97q") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "kube-api-access-lm97q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.593768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a6e763c5-5d99-4374-9ade-5ac3ff4b9817" (UID: "a6e763c5-5d99-4374-9ade-5ac3ff4b9817"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.688855 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm97q\" (UniqueName: \"kubernetes.io/projected/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-kube-api-access-lm97q\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.688890 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:40 crc kubenswrapper[4776]: I0128 07:02:40.688898 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6e763c5-5d99-4374-9ade-5ac3ff4b9817-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106602 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-p5b6p_a6e763c5-5d99-4374-9ade-5ac3ff4b9817/console/0.log" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106657 4776 generic.go:334] "Generic (PLEG): container finished" podID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerID="ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7" exitCode=2 Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5b6p" event={"ID":"a6e763c5-5d99-4374-9ade-5ac3ff4b9817","Type":"ContainerDied","Data":"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7"} Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106722 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p5b6p" event={"ID":"a6e763c5-5d99-4374-9ade-5ac3ff4b9817","Type":"ContainerDied","Data":"3d89ab939b57aedd844c91620553f4e231a51d0fdc5bace9ea9e7914f20dd8cd"} Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106745 4776 scope.go:117] "RemoveContainer" containerID="ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.106777 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p5b6p" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.136666 4776 scope.go:117] "RemoveContainer" containerID="ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7" Jan 28 07:02:41 crc kubenswrapper[4776]: E0128 07:02:41.137683 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7\": container with ID starting with ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7 not found: ID does not exist" containerID="ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.137793 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7"} err="failed to get container status \"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7\": rpc error: code = NotFound desc = could not find container \"ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7\": container with ID starting with ed924b20152d9bdae24c3c1cf7e7602c24fcc7baa86710f20557b7388ab1cfd7 not found: ID does not exist" Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.163491 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.167703 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-p5b6p"] Jan 28 07:02:41 crc kubenswrapper[4776]: I0128 07:02:41.318742 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" path="/var/lib/kubelet/pods/a6e763c5-5d99-4374-9ade-5ac3ff4b9817/volumes" Jan 28 07:02:42 crc kubenswrapper[4776]: I0128 07:02:42.118270 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerID="996e7a883f03f6fdece9e1e8eb6136ebdca0811f284de76a8bf84c6617338c76" exitCode=0 Jan 28 07:02:42 crc kubenswrapper[4776]: I0128 07:02:42.118341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" event={"ID":"212e186f-3642-483d-adf6-00dfaf77ca5f","Type":"ContainerDied","Data":"996e7a883f03f6fdece9e1e8eb6136ebdca0811f284de76a8bf84c6617338c76"} Jan 28 07:02:43 crc kubenswrapper[4776]: I0128 07:02:43.129922 4776 generic.go:334] "Generic (PLEG): container finished" podID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerID="f7f6264e4d79e74a59afbfb5bb5e1ebc30b81d25073de1e4fc21305469dfc80a" exitCode=0 Jan 28 07:02:43 crc kubenswrapper[4776]: I0128 07:02:43.129966 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" event={"ID":"212e186f-3642-483d-adf6-00dfaf77ca5f","Type":"ContainerDied","Data":"f7f6264e4d79e74a59afbfb5bb5e1ebc30b81d25073de1e4fc21305469dfc80a"} Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.418185 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.438300 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util\") pod \"212e186f-3642-483d-adf6-00dfaf77ca5f\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.438353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle\") pod \"212e186f-3642-483d-adf6-00dfaf77ca5f\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.438427 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrzqz\" (UniqueName: \"kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz\") pod \"212e186f-3642-483d-adf6-00dfaf77ca5f\" (UID: \"212e186f-3642-483d-adf6-00dfaf77ca5f\") " Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.439587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle" (OuterVolumeSpecName: "bundle") pod "212e186f-3642-483d-adf6-00dfaf77ca5f" (UID: "212e186f-3642-483d-adf6-00dfaf77ca5f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.447793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz" (OuterVolumeSpecName: "kube-api-access-qrzqz") pod "212e186f-3642-483d-adf6-00dfaf77ca5f" (UID: "212e186f-3642-483d-adf6-00dfaf77ca5f"). InnerVolumeSpecName "kube-api-access-qrzqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.451362 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util" (OuterVolumeSpecName: "util") pod "212e186f-3642-483d-adf6-00dfaf77ca5f" (UID: "212e186f-3642-483d-adf6-00dfaf77ca5f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.539689 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrzqz\" (UniqueName: \"kubernetes.io/projected/212e186f-3642-483d-adf6-00dfaf77ca5f-kube-api-access-qrzqz\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.539746 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-util\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:44 crc kubenswrapper[4776]: I0128 07:02:44.539765 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/212e186f-3642-483d-adf6-00dfaf77ca5f-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:02:45 crc kubenswrapper[4776]: I0128 07:02:45.150353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" event={"ID":"212e186f-3642-483d-adf6-00dfaf77ca5f","Type":"ContainerDied","Data":"26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e"} Jan 28 07:02:45 crc kubenswrapper[4776]: I0128 07:02:45.151137 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26aa740ba4982b8f8c9a3244bc1bef30cdaa729f38bbddfd9c6792e45b50f95e" Jan 28 07:02:45 crc kubenswrapper[4776]: I0128 07:02:45.150730 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.911788 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74f4f84-5s97b"] Jan 28 07:02:53 crc kubenswrapper[4776]: E0128 07:02:53.912630 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="pull" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912647 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="pull" Jan 28 07:02:53 crc kubenswrapper[4776]: E0128 07:02:53.912663 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerName="console" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912670 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerName="console" Jan 28 07:02:53 crc kubenswrapper[4776]: E0128 07:02:53.912680 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="util" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912688 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="util" Jan 28 07:02:53 crc kubenswrapper[4776]: E0128 07:02:53.912704 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="extract" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912712 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" containerName="extract" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912834 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="212e186f-3642-483d-adf6-00dfaf77ca5f" 
containerName="extract" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.912854 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e763c5-5d99-4374-9ade-5ac3ff4b9817" containerName="console" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.913357 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.914997 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.915507 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.915831 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.915980 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.916803 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-44dc4" Jan 28 07:02:53 crc kubenswrapper[4776]: I0128 07:02:53.922330 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74f4f84-5s97b"] Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.064798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6w76\" (UniqueName: \"kubernetes.io/projected/b127309b-519f-42d4-9aca-30708ae2aae1-kube-api-access-t6w76\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" 
Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.064905 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-webhook-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.064942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-apiservice-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.166427 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6w76\" (UniqueName: \"kubernetes.io/projected/b127309b-519f-42d4-9aca-30708ae2aae1-kube-api-access-t6w76\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.166503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-webhook-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.166537 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-apiservice-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.174227 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-webhook-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.184172 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b127309b-519f-42d4-9aca-30708ae2aae1-apiservice-cert\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.190100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6w76\" (UniqueName: \"kubernetes.io/projected/b127309b-519f-42d4-9aca-30708ae2aae1-kube-api-access-t6w76\") pod \"metallb-operator-controller-manager-74f4f84-5s97b\" (UID: \"b127309b-519f-42d4-9aca-30708ae2aae1\") " pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.227228 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.361884 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6"] Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.373871 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.377048 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.377054 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.377163 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-v4vwc" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.386083 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6"] Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.472293 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-webhook-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.472374 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-apiservice-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: 
\"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.472453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbb5\" (UniqueName: \"kubernetes.io/projected/10b029bb-8821-4602-9b1d-910d59efc97a-kube-api-access-sdbb5\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.574158 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbb5\" (UniqueName: \"kubernetes.io/projected/10b029bb-8821-4602-9b1d-910d59efc97a-kube-api-access-sdbb5\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.574236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-webhook-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.574284 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-apiservice-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.582157 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-webhook-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.582877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10b029bb-8821-4602-9b1d-910d59efc97a-apiservice-cert\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.589875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbb5\" (UniqueName: \"kubernetes.io/projected/10b029bb-8821-4602-9b1d-910d59efc97a-kube-api-access-sdbb5\") pod \"metallb-operator-webhook-server-f68d4f57-8pgd6\" (UID: \"10b029bb-8821-4602-9b1d-910d59efc97a\") " pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.694813 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.713720 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74f4f84-5s97b"] Jan 28 07:02:54 crc kubenswrapper[4776]: W0128 07:02:54.723236 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb127309b_519f_42d4_9aca_30708ae2aae1.slice/crio-ecc07a5fa6c7c9ddcf1610c08189e27f07b5073b5391e8e3eaec0943e4410ea0 WatchSource:0}: Error finding container ecc07a5fa6c7c9ddcf1610c08189e27f07b5073b5391e8e3eaec0943e4410ea0: Status 404 returned error can't find the container with id ecc07a5fa6c7c9ddcf1610c08189e27f07b5073b5391e8e3eaec0943e4410ea0 Jan 28 07:02:54 crc kubenswrapper[4776]: I0128 07:02:54.904788 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6"] Jan 28 07:02:54 crc kubenswrapper[4776]: W0128 07:02:54.913560 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b029bb_8821_4602_9b1d_910d59efc97a.slice/crio-4158f795a2e4b7b68835759c391adeae107ac0cf7be7b10bdabb414a985bca71 WatchSource:0}: Error finding container 4158f795a2e4b7b68835759c391adeae107ac0cf7be7b10bdabb414a985bca71: Status 404 returned error can't find the container with id 4158f795a2e4b7b68835759c391adeae107ac0cf7be7b10bdabb414a985bca71 Jan 28 07:02:55 crc kubenswrapper[4776]: I0128 07:02:55.213749 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" event={"ID":"10b029bb-8821-4602-9b1d-910d59efc97a","Type":"ContainerStarted","Data":"4158f795a2e4b7b68835759c391adeae107ac0cf7be7b10bdabb414a985bca71"} Jan 28 07:02:55 crc kubenswrapper[4776]: I0128 07:02:55.215315 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" event={"ID":"b127309b-519f-42d4-9aca-30708ae2aae1","Type":"ContainerStarted","Data":"ecc07a5fa6c7c9ddcf1610c08189e27f07b5073b5391e8e3eaec0943e4410ea0"} Jan 28 07:03:00 crc kubenswrapper[4776]: I0128 07:03:00.253211 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" event={"ID":"b127309b-519f-42d4-9aca-30708ae2aae1","Type":"ContainerStarted","Data":"f3e9d7e12ccf1d310e6381ecb5eb19258f03b592506ddad3681299da065c476c"} Jan 28 07:03:00 crc kubenswrapper[4776]: I0128 07:03:00.254248 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:03:00 crc kubenswrapper[4776]: I0128 07:03:00.260508 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" event={"ID":"10b029bb-8821-4602-9b1d-910d59efc97a","Type":"ContainerStarted","Data":"7e067bf86e8b35ff268b649e35fd5997629fe583ddf10327f6066d22b5928503"} Jan 28 07:03:00 crc kubenswrapper[4776]: I0128 07:03:00.260863 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:03:00 crc kubenswrapper[4776]: I0128 07:03:00.287165 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" podStartSLOduration=2.427901022 podStartE2EDuration="7.287142313s" podCreationTimestamp="2026-01-28 07:02:53 +0000 UTC" firstStartedPulling="2026-01-28 07:02:54.725945432 +0000 UTC m=+746.141605592" lastFinishedPulling="2026-01-28 07:02:59.585186723 +0000 UTC m=+751.000846883" observedRunningTime="2026-01-28 07:03:00.284788878 +0000 UTC m=+751.700449078" watchObservedRunningTime="2026-01-28 07:03:00.287142313 +0000 UTC m=+751.702802483" Jan 28 07:03:00 crc 
kubenswrapper[4776]: I0128 07:03:00.316059 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" podStartSLOduration=1.6236090399999998 podStartE2EDuration="6.316031087s" podCreationTimestamp="2026-01-28 07:02:54 +0000 UTC" firstStartedPulling="2026-01-28 07:02:54.916634732 +0000 UTC m=+746.332294892" lastFinishedPulling="2026-01-28 07:02:59.609056779 +0000 UTC m=+751.024716939" observedRunningTime="2026-01-28 07:03:00.306488035 +0000 UTC m=+751.722148235" watchObservedRunningTime="2026-01-28 07:03:00.316031087 +0000 UTC m=+751.731691287" Jan 28 07:03:14 crc kubenswrapper[4776]: I0128 07:03:14.701377 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" Jan 28 07:03:33 crc kubenswrapper[4776]: I0128 07:03:33.852518 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:03:33 crc kubenswrapper[4776]: I0128 07:03:33.853375 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:03:34 crc kubenswrapper[4776]: I0128 07:03:34.229624 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74f4f84-5s97b" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.008487 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s"] Jan 28 07:03:35 crc 
kubenswrapper[4776]: I0128 07:03:35.010086 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.016506 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g4bxr"] Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.018439 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-89t4p" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.018436 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.019637 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.021959 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.022896 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.032890 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s"] Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.099142 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mjlbx"] Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.100005 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.106887 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.113651 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5fnm9" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.113749 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.113872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.118455 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-9qzfc"] Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.119322 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.120607 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.129950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130025 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-sockets\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-reloader\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130066 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh6s\" (UniqueName: \"kubernetes.io/projected/a349654d-030c-4341-b884-8f295ea9dfa9-kube-api-access-xwh6s\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130141 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-conf\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f824bf65-570f-4d47-8006-8e13fb86368f-metrics-certs\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-metrics\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rcn\" (UniqueName: \"kubernetes.io/projected/f824bf65-570f-4d47-8006-8e13fb86368f-kube-api-access-48rcn\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.130308 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f824bf65-570f-4d47-8006-8e13fb86368f-frr-startup\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.150898 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-9qzfc"] Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231417 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcvck\" (UniqueName: \"kubernetes.io/projected/896d6757-3340-421c-937a-d6e35e752bdc-kube-api-access-tcvck\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231578 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rg2w\" (UniqueName: \"kubernetes.io/projected/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-kube-api-access-2rg2w\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231630 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231666 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-sockets\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.231682 4776 secret.go:188] Couldn't get secret 
metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231700 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-reloader\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.231732 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert podName:a349654d-030c-4341-b884-8f295ea9dfa9 nodeName:}" failed. No retries permitted until 2026-01-28 07:03:35.731715838 +0000 UTC m=+787.147375998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert") pod "frr-k8s-webhook-server-7df86c4f6c-mbw9s" (UID: "a349654d-030c-4341-b884-8f295ea9dfa9") : secret "frr-k8s-webhook-server-cert" not found Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh6s\" (UniqueName: \"kubernetes.io/projected/a349654d-030c-4341-b884-8f295ea9dfa9-kube-api-access-xwh6s\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-conf\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231808 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metallb-excludel2\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f824bf65-570f-4d47-8006-8e13fb86368f-metrics-certs\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-metrics\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metrics-certs\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rcn\" (UniqueName: \"kubernetes.io/projected/f824bf65-570f-4d47-8006-8e13fb86368f-kube-api-access-48rcn\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231900 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-metrics-certs\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.231984 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f824bf65-570f-4d47-8006-8e13fb86368f-frr-startup\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.232029 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-cert\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.232219 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-conf\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.232393 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-frr-sockets\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.232451 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-metrics\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 
28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.232652 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f824bf65-570f-4d47-8006-8e13fb86368f-reloader\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.233139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f824bf65-570f-4d47-8006-8e13fb86368f-frr-startup\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.237113 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f824bf65-570f-4d47-8006-8e13fb86368f-metrics-certs\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.250452 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh6s\" (UniqueName: \"kubernetes.io/projected/a349654d-030c-4341-b884-8f295ea9dfa9-kube-api-access-xwh6s\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.250878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rcn\" (UniqueName: \"kubernetes.io/projected/f824bf65-570f-4d47-8006-8e13fb86368f-kube-api-access-48rcn\") pod \"frr-k8s-g4bxr\" (UID: \"f824bf65-570f-4d47-8006-8e13fb86368f\") " pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333237 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2rg2w\" (UniqueName: \"kubernetes.io/projected/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-kube-api-access-2rg2w\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333298 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metallb-excludel2\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metrics-certs\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333402 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-metrics-certs\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-cert\") pod \"controller-6968d8fdc4-9qzfc\" (UID: 
\"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.333471 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcvck\" (UniqueName: \"kubernetes.io/projected/896d6757-3340-421c-937a-d6e35e752bdc-kube-api-access-tcvck\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.333478 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.333736 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist podName:6f2ff038-c715-4cff-a872-ac6ae5c7fbff nodeName:}" failed. No retries permitted until 2026-01-28 07:03:35.833715731 +0000 UTC m=+787.249375901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist") pod "speaker-mjlbx" (UID: "6f2ff038-c715-4cff-a872-ac6ae5c7fbff") : secret "metallb-memberlist" not found Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.334773 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metallb-excludel2\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.336541 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.337401 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-metrics-certs\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.337555 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-metrics-certs\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.348162 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/896d6757-3340-421c-937a-d6e35e752bdc-cert\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.350762 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tcvck\" (UniqueName: \"kubernetes.io/projected/896d6757-3340-421c-937a-d6e35e752bdc-kube-api-access-tcvck\") pod \"controller-6968d8fdc4-9qzfc\" (UID: \"896d6757-3340-421c-937a-d6e35e752bdc\") " pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.351131 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.356178 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rg2w\" (UniqueName: \"kubernetes.io/projected/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-kube-api-access-2rg2w\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.433447 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.525975 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"e1f570d26fe6cccb8a2825c632c894f3f7bfc48faaa08c1a04977446e36483dc"} Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.738195 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: \"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.745356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a349654d-030c-4341-b884-8f295ea9dfa9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-mbw9s\" (UID: 
\"a349654d-030c-4341-b884-8f295ea9dfa9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.840035 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.840370 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 28 07:03:35 crc kubenswrapper[4776]: E0128 07:03:35.840466 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist podName:6f2ff038-c715-4cff-a872-ac6ae5c7fbff nodeName:}" failed. No retries permitted until 2026-01-28 07:03:36.840436456 +0000 UTC m=+788.256096646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist") pod "speaker-mjlbx" (UID: "6f2ff038-c715-4cff-a872-ac6ae5c7fbff") : secret "metallb-memberlist" not found Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.848721 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-9qzfc"] Jan 28 07:03:35 crc kubenswrapper[4776]: W0128 07:03:35.868730 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896d6757_3340_421c_937a_d6e35e752bdc.slice/crio-0e0dfb7aae83b8b5f41fc605c91656672805fdf336ee88f1ca0dc2f1d73ff003 WatchSource:0}: Error finding container 0e0dfb7aae83b8b5f41fc605c91656672805fdf336ee88f1ca0dc2f1d73ff003: Status 404 returned error can't find the container with id 0e0dfb7aae83b8b5f41fc605c91656672805fdf336ee88f1ca0dc2f1d73ff003 Jan 28 07:03:35 crc kubenswrapper[4776]: I0128 07:03:35.939531 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.439738 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s"] Jan 28 07:03:36 crc kubenswrapper[4776]: W0128 07:03:36.446979 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda349654d_030c_4341_b884_8f295ea9dfa9.slice/crio-0e96e69dbd130557a16c0570c092b85ddc7f3983e6e5d50812b1b59bfa5fc9e5 WatchSource:0}: Error finding container 0e96e69dbd130557a16c0570c092b85ddc7f3983e6e5d50812b1b59bfa5fc9e5: Status 404 returned error can't find the container with id 0e96e69dbd130557a16c0570c092b85ddc7f3983e6e5d50812b1b59bfa5fc9e5 Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.533149 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9qzfc" event={"ID":"896d6757-3340-421c-937a-d6e35e752bdc","Type":"ContainerStarted","Data":"47f886d53ae21dcfd95632d8768b0fd2abdd4515b838ec10c001db48b45b0649"} Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.533219 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9qzfc" event={"ID":"896d6757-3340-421c-937a-d6e35e752bdc","Type":"ContainerStarted","Data":"5b85d854c49c54be09d7a5f56b73f6b1b967f87d8a1fd53871769372f23c2169"} Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.533234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9qzfc" event={"ID":"896d6757-3340-421c-937a-d6e35e752bdc","Type":"ContainerStarted","Data":"0e0dfb7aae83b8b5f41fc605c91656672805fdf336ee88f1ca0dc2f1d73ff003"} Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.533293 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 
07:03:36.534119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" event={"ID":"a349654d-030c-4341-b884-8f295ea9dfa9","Type":"ContainerStarted","Data":"0e96e69dbd130557a16c0570c092b85ddc7f3983e6e5d50812b1b59bfa5fc9e5"} Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.859150 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.865183 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f2ff038-c715-4cff-a872-ac6ae5c7fbff-memberlist\") pod \"speaker-mjlbx\" (UID: \"6f2ff038-c715-4cff-a872-ac6ae5c7fbff\") " pod="metallb-system/speaker-mjlbx" Jan 28 07:03:36 crc kubenswrapper[4776]: I0128 07:03:36.914965 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mjlbx" Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.544068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mjlbx" event={"ID":"6f2ff038-c715-4cff-a872-ac6ae5c7fbff","Type":"ContainerStarted","Data":"9bdcdf0f12f81317678dec15a04534de1da37cf70da269cd8a8d6953881a09dc"} Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.544103 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mjlbx" event={"ID":"6f2ff038-c715-4cff-a872-ac6ae5c7fbff","Type":"ContainerStarted","Data":"bee89318ad1d93b389c92a4ea4010e7203b2db3f2348b98a2119b88c4ae700c5"} Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.544116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mjlbx" event={"ID":"6f2ff038-c715-4cff-a872-ac6ae5c7fbff","Type":"ContainerStarted","Data":"8608176589ce5b2d990ded8ced839e44d03774c1c218b6df89234f6770ebc580"} Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.544857 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mjlbx" Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.569147 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mjlbx" podStartSLOduration=2.56912575 podStartE2EDuration="2.56912575s" podCreationTimestamp="2026-01-28 07:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:03:37.567171447 +0000 UTC m=+788.982831607" watchObservedRunningTime="2026-01-28 07:03:37.56912575 +0000 UTC m=+788.984785910" Jan 28 07:03:37 crc kubenswrapper[4776]: I0128 07:03:37.570672 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-9qzfc" podStartSLOduration=2.570664443 podStartE2EDuration="2.570664443s" podCreationTimestamp="2026-01-28 07:03:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:03:36.556992777 +0000 UTC m=+787.972652937" watchObservedRunningTime="2026-01-28 07:03:37.570664443 +0000 UTC m=+788.986324603" Jan 28 07:03:43 crc kubenswrapper[4776]: I0128 07:03:43.591419 4776 generic.go:334] "Generic (PLEG): container finished" podID="f824bf65-570f-4d47-8006-8e13fb86368f" containerID="9e67172738cfb2c6feb6587b1cb5a0f96e7fa6b6703e3b1e9f19355e444cd55c" exitCode=0 Jan 28 07:03:43 crc kubenswrapper[4776]: I0128 07:03:43.591476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerDied","Data":"9e67172738cfb2c6feb6587b1cb5a0f96e7fa6b6703e3b1e9f19355e444cd55c"} Jan 28 07:03:43 crc kubenswrapper[4776]: I0128 07:03:43.593770 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" event={"ID":"a349654d-030c-4341-b884-8f295ea9dfa9","Type":"ContainerStarted","Data":"2ffd74da9c77cd2bdf087ebc95c4d0e83898510402f3a46321b5f71b96fec2d1"} Jan 28 07:03:43 crc kubenswrapper[4776]: I0128 07:03:43.594639 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:43 crc kubenswrapper[4776]: I0128 07:03:43.649918 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" podStartSLOduration=3.308074766 podStartE2EDuration="9.64989141s" podCreationTimestamp="2026-01-28 07:03:34 +0000 UTC" firstStartedPulling="2026-01-28 07:03:36.453513603 +0000 UTC m=+787.869173763" lastFinishedPulling="2026-01-28 07:03:42.795330237 +0000 UTC m=+794.210990407" observedRunningTime="2026-01-28 07:03:43.641449668 +0000 UTC m=+795.057109828" watchObservedRunningTime="2026-01-28 07:03:43.64989141 +0000 UTC m=+795.065551600" Jan 28 
07:03:44 crc kubenswrapper[4776]: I0128 07:03:44.604827 4776 generic.go:334] "Generic (PLEG): container finished" podID="f824bf65-570f-4d47-8006-8e13fb86368f" containerID="986e60fb9ebea988043ef091ff91e5bc30cb10fda1e9166eacb7dff71703afdb" exitCode=0 Jan 28 07:03:44 crc kubenswrapper[4776]: I0128 07:03:44.604868 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerDied","Data":"986e60fb9ebea988043ef091ff91e5bc30cb10fda1e9166eacb7dff71703afdb"} Jan 28 07:03:45 crc kubenswrapper[4776]: I0128 07:03:45.616856 4776 generic.go:334] "Generic (PLEG): container finished" podID="f824bf65-570f-4d47-8006-8e13fb86368f" containerID="5b9527126338fb6d60c9fee64d7ab5f6d2c93bbbd4380522a452c400c15a541e" exitCode=0 Jan 28 07:03:45 crc kubenswrapper[4776]: I0128 07:03:45.616929 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerDied","Data":"5b9527126338fb6d60c9fee64d7ab5f6d2c93bbbd4380522a452c400c15a541e"} Jan 28 07:03:46 crc kubenswrapper[4776]: I0128 07:03:46.634175 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"53c373b38ed75744448f94bf46296a2a4b0f96700f3f526bb91ddd958e733e94"} Jan 28 07:03:46 crc kubenswrapper[4776]: I0128 07:03:46.634662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"479feef449303af5cef30642fac6d17b6f0de69ccd0b4c4e0dcea419c88b3025"} Jan 28 07:03:46 crc kubenswrapper[4776]: I0128 07:03:46.634693 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" 
event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"e9e35842ad3d60d6b70e9566ab5ca907c9f7562bba2472fb5fa258f28c59f3e0"} Jan 28 07:03:46 crc kubenswrapper[4776]: I0128 07:03:46.634717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"0833dd2649219d7c47162283ce173eac9634d6c0f89e79338dd2f6a5a309766d"} Jan 28 07:03:46 crc kubenswrapper[4776]: I0128 07:03:46.634740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"6b5efe406a38c12a61afcc1a7dec12658b68f413e8117b5f899d253a897d0d48"} Jan 28 07:03:47 crc kubenswrapper[4776]: I0128 07:03:47.649498 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4bxr" event={"ID":"f824bf65-570f-4d47-8006-8e13fb86368f","Type":"ContainerStarted","Data":"4325fb3362334b59ff1b799ada82b4de14a421b274d40dac6027dd47aa3e137c"} Jan 28 07:03:47 crc kubenswrapper[4776]: I0128 07:03:47.650087 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:47 crc kubenswrapper[4776]: I0128 07:03:47.698525 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g4bxr" podStartSLOduration=6.413384933 podStartE2EDuration="13.698504567s" podCreationTimestamp="2026-01-28 07:03:34 +0000 UTC" firstStartedPulling="2026-01-28 07:03:35.502158891 +0000 UTC m=+786.917819061" lastFinishedPulling="2026-01-28 07:03:42.787278525 +0000 UTC m=+794.202938695" observedRunningTime="2026-01-28 07:03:47.692440051 +0000 UTC m=+799.108100281" watchObservedRunningTime="2026-01-28 07:03:47.698504567 +0000 UTC m=+799.114164737" Jan 28 07:03:50 crc kubenswrapper[4776]: I0128 07:03:50.352519 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:50 crc kubenswrapper[4776]: I0128 07:03:50.399862 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:55 crc kubenswrapper[4776]: I0128 07:03:55.354633 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g4bxr" Jan 28 07:03:55 crc kubenswrapper[4776]: I0128 07:03:55.440525 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-9qzfc" Jan 28 07:03:55 crc kubenswrapper[4776]: I0128 07:03:55.944005 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-mbw9s" Jan 28 07:03:56 crc kubenswrapper[4776]: I0128 07:03:56.921246 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mjlbx" Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.850136 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.852263 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.887900 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.915772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhlw\" (UniqueName: \"kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.916095 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:58 crc kubenswrapper[4776]: I0128 07:03:58.916321 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.017417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.017508 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ljhlw\" (UniqueName: \"kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.017536 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.018091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.018185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.055875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhlw\" (UniqueName: \"kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw\") pod \"certified-operators-568sw\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.183397 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.432335 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:03:59 crc kubenswrapper[4776]: W0128 07:03:59.436093 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa6487d_4836_485f_91b2_cbb55dc70030.slice/crio-b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384 WatchSource:0}: Error finding container b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384: Status 404 returned error can't find the container with id b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384 Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.745525 4776 generic.go:334] "Generic (PLEG): container finished" podID="baa6487d-4836-485f-91b2-cbb55dc70030" containerID="1bf2c25d96bfb443c0d64a331ff147fd56e0a1e88b358ecb63ce83150c4fa11d" exitCode=0 Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.745760 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerDied","Data":"1bf2c25d96bfb443c0d64a331ff147fd56e0a1e88b358ecb63ce83150c4fa11d"} Jan 28 07:03:59 crc kubenswrapper[4776]: I0128 07:03:59.745843 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerStarted","Data":"b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384"} Jan 28 07:04:00 crc kubenswrapper[4776]: I0128 07:04:00.753258 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" 
event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerStarted","Data":"d5f02d16d9d027560160966c6bf65cd53e40009aab9b6b04f1102162b4ff7302"} Jan 28 07:04:01 crc kubenswrapper[4776]: I0128 07:04:01.761220 4776 generic.go:334] "Generic (PLEG): container finished" podID="baa6487d-4836-485f-91b2-cbb55dc70030" containerID="d5f02d16d9d027560160966c6bf65cd53e40009aab9b6b04f1102162b4ff7302" exitCode=0 Jan 28 07:04:01 crc kubenswrapper[4776]: I0128 07:04:01.761279 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerDied","Data":"d5f02d16d9d027560160966c6bf65cd53e40009aab9b6b04f1102162b4ff7302"} Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.126863 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-njlm4"] Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.128817 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.135154 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.135336 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j2mzx" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.135489 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njlm4"] Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.136273 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.139390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerStarted","Data":"26c1327c61fa8f8e69913397bbcb81df98863c71b4e09664fca2eceb40888e59"} Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.176886 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-568sw" podStartSLOduration=2.727124394 podStartE2EDuration="5.176869743s" podCreationTimestamp="2026-01-28 07:03:58 +0000 UTC" firstStartedPulling="2026-01-28 07:03:59.747255657 +0000 UTC m=+811.162915817" lastFinishedPulling="2026-01-28 07:04:02.197001006 +0000 UTC m=+813.612661166" observedRunningTime="2026-01-28 07:04:03.172924964 +0000 UTC m=+814.588585164" watchObservedRunningTime="2026-01-28 07:04:03.176869743 +0000 UTC m=+814.592529903" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.324373 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29gj\" (UniqueName: 
\"kubernetes.io/projected/ce3e5ab9-01db-487b-9176-60b655f03b9b-kube-api-access-z29gj\") pod \"openstack-operator-index-njlm4\" (UID: \"ce3e5ab9-01db-487b-9176-60b655f03b9b\") " pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.426010 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z29gj\" (UniqueName: \"kubernetes.io/projected/ce3e5ab9-01db-487b-9176-60b655f03b9b-kube-api-access-z29gj\") pod \"openstack-operator-index-njlm4\" (UID: \"ce3e5ab9-01db-487b-9176-60b655f03b9b\") " pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.455122 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29gj\" (UniqueName: \"kubernetes.io/projected/ce3e5ab9-01db-487b-9176-60b655f03b9b-kube-api-access-z29gj\") pod \"openstack-operator-index-njlm4\" (UID: \"ce3e5ab9-01db-487b-9176-60b655f03b9b\") " pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.458986 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.851682 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.852043 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:04:03 crc kubenswrapper[4776]: I0128 07:04:03.889088 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njlm4"] Jan 28 07:04:04 crc kubenswrapper[4776]: I0128 07:04:04.148441 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njlm4" event={"ID":"ce3e5ab9-01db-487b-9176-60b655f03b9b","Type":"ContainerStarted","Data":"936db72d3cd72d8cfdad76987b0567aa12111495fef269f5ae58041371aef171"} Jan 28 07:04:07 crc kubenswrapper[4776]: I0128 07:04:07.175281 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njlm4" event={"ID":"ce3e5ab9-01db-487b-9176-60b655f03b9b","Type":"ContainerStarted","Data":"c0c6cd079c3d6d3ba81b6ff5b6d81e4fcc374e77bf1922b184bbf187ad690f9e"} Jan 28 07:04:07 crc kubenswrapper[4776]: I0128 07:04:07.189776 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-njlm4" podStartSLOduration=3.052033791 podStartE2EDuration="5.189755306s" podCreationTimestamp="2026-01-28 07:04:02 +0000 UTC" 
firstStartedPulling="2026-01-28 07:04:03.899963343 +0000 UTC m=+815.315623513" lastFinishedPulling="2026-01-28 07:04:06.037684868 +0000 UTC m=+817.453345028" observedRunningTime="2026-01-28 07:04:07.189392777 +0000 UTC m=+818.605052967" watchObservedRunningTime="2026-01-28 07:04:07.189755306 +0000 UTC m=+818.605415496" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.184133 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.184446 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.213277 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.214877 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.239235 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.270435 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.351333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.351430 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.351731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j757\" (UniqueName: \"kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.453041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j757\" (UniqueName: \"kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.453222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.453288 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.453912 4776 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.454198 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.472774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j757\" (UniqueName: \"kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757\") pod \"community-operators-jn6pq\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:09 crc kubenswrapper[4776]: I0128 07:04:09.538810 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:10 crc kubenswrapper[4776]: I0128 07:04:10.076469 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:10 crc kubenswrapper[4776]: W0128 07:04:10.086806 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55a7362c_68b7_4443_98c0_0d79dbf5b912.slice/crio-60fcbb4a6c090f6716fa669b19ed2554aa4e8cce303a6641918e89a586a5408f WatchSource:0}: Error finding container 60fcbb4a6c090f6716fa669b19ed2554aa4e8cce303a6641918e89a586a5408f: Status 404 returned error can't find the container with id 60fcbb4a6c090f6716fa669b19ed2554aa4e8cce303a6641918e89a586a5408f Jan 28 07:04:10 crc kubenswrapper[4776]: I0128 07:04:10.197009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerStarted","Data":"60fcbb4a6c090f6716fa669b19ed2554aa4e8cce303a6641918e89a586a5408f"} Jan 28 07:04:10 crc kubenswrapper[4776]: I0128 07:04:10.257718 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:11 crc kubenswrapper[4776]: I0128 07:04:11.207731 4776 generic.go:334] "Generic (PLEG): container finished" podID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerID="02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0" exitCode=0 Jan 28 07:04:11 crc kubenswrapper[4776]: I0128 07:04:11.207895 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerDied","Data":"02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0"} Jan 28 07:04:12 crc kubenswrapper[4776]: I0128 07:04:12.813192 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:04:12 crc kubenswrapper[4776]: I0128 07:04:12.814099 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-568sw" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="registry-server" containerID="cri-o://26c1327c61fa8f8e69913397bbcb81df98863c71b4e09664fca2eceb40888e59" gracePeriod=2 Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.223821 4776 generic.go:334] "Generic (PLEG): container finished" podID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerID="bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c" exitCode=0 Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.225055 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerDied","Data":"bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c"} Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.229166 4776 generic.go:334] "Generic (PLEG): container finished" podID="baa6487d-4836-485f-91b2-cbb55dc70030" containerID="26c1327c61fa8f8e69913397bbcb81df98863c71b4e09664fca2eceb40888e59" exitCode=0 Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.229211 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerDied","Data":"26c1327c61fa8f8e69913397bbcb81df98863c71b4e09664fca2eceb40888e59"} Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.229241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-568sw" event={"ID":"baa6487d-4836-485f-91b2-cbb55dc70030","Type":"ContainerDied","Data":"b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384"} Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.229258 4776 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="b66f07bfbccb6ad36f1284f3031dbde4ff8bc6b367599a06612e9d6108f2c384" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.280672 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.418531 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhlw\" (UniqueName: \"kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw\") pod \"baa6487d-4836-485f-91b2-cbb55dc70030\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.418675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content\") pod \"baa6487d-4836-485f-91b2-cbb55dc70030\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.418871 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities\") pod \"baa6487d-4836-485f-91b2-cbb55dc70030\" (UID: \"baa6487d-4836-485f-91b2-cbb55dc70030\") " Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.420269 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities" (OuterVolumeSpecName: "utilities") pod "baa6487d-4836-485f-91b2-cbb55dc70030" (UID: "baa6487d-4836-485f-91b2-cbb55dc70030"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.425129 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw" (OuterVolumeSpecName: "kube-api-access-ljhlw") pod "baa6487d-4836-485f-91b2-cbb55dc70030" (UID: "baa6487d-4836-485f-91b2-cbb55dc70030"). InnerVolumeSpecName "kube-api-access-ljhlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.460071 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.460129 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.500051 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.503383 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "baa6487d-4836-485f-91b2-cbb55dc70030" (UID: "baa6487d-4836-485f-91b2-cbb55dc70030"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.521623 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.521679 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhlw\" (UniqueName: \"kubernetes.io/projected/baa6487d-4836-485f-91b2-cbb55dc70030-kube-api-access-ljhlw\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.521701 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa6487d-4836-485f-91b2-cbb55dc70030-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.618416 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:13 crc kubenswrapper[4776]: E0128 07:04:13.618899 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="registry-server" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.618935 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="registry-server" Jan 28 07:04:13 crc kubenswrapper[4776]: E0128 07:04:13.618996 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="extract-content" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.619009 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="extract-content" Jan 28 07:04:13 crc kubenswrapper[4776]: E0128 07:04:13.619038 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="extract-utilities" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.619052 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="extract-utilities" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.619272 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" containerName="registry-server" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.625231 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.635968 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.729624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xjf7\" (UniqueName: \"kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.730052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.730123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities\") pod \"redhat-marketplace-h5vsd\" (UID: 
\"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.832238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.832316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.832458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xjf7\" (UniqueName: \"kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.832792 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.832922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " 
pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.852497 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xjf7\" (UniqueName: \"kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7\") pod \"redhat-marketplace-h5vsd\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:13 crc kubenswrapper[4776]: I0128 07:04:13.949029 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.167627 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:14 crc kubenswrapper[4776]: W0128 07:04:14.169714 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30af7e8_61cf_4234_abaf_7e166b80f632.slice/crio-904f0211c7d20b3b5cc7a93482f88ce0dd1c0194f1240dd1232fa9da6847871b WatchSource:0}: Error finding container 904f0211c7d20b3b5cc7a93482f88ce0dd1c0194f1240dd1232fa9da6847871b: Status 404 returned error can't find the container with id 904f0211c7d20b3b5cc7a93482f88ce0dd1c0194f1240dd1232fa9da6847871b Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.241188 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerStarted","Data":"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0"} Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.242619 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" 
event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerStarted","Data":"904f0211c7d20b3b5cc7a93482f88ce0dd1c0194f1240dd1232fa9da6847871b"} Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.242746 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-568sw" Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.260733 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jn6pq" podStartSLOduration=2.801079635 podStartE2EDuration="5.260720446s" podCreationTimestamp="2026-01-28 07:04:09 +0000 UTC" firstStartedPulling="2026-01-28 07:04:11.209497459 +0000 UTC m=+822.625157659" lastFinishedPulling="2026-01-28 07:04:13.66913831 +0000 UTC m=+825.084798470" observedRunningTime="2026-01-28 07:04:14.257007035 +0000 UTC m=+825.672667195" watchObservedRunningTime="2026-01-28 07:04:14.260720446 +0000 UTC m=+825.676380606" Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.269728 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-njlm4" Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.301053 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:04:14 crc kubenswrapper[4776]: I0128 07:04:14.305219 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-568sw"] Jan 28 07:04:15 crc kubenswrapper[4776]: I0128 07:04:15.248623 4776 generic.go:334] "Generic (PLEG): container finished" podID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerID="050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344" exitCode=0 Jan 28 07:04:15 crc kubenswrapper[4776]: I0128 07:04:15.248672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" 
event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerDied","Data":"050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344"} Jan 28 07:04:15 crc kubenswrapper[4776]: I0128 07:04:15.315621 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa6487d-4836-485f-91b2-cbb55dc70030" path="/var/lib/kubelet/pods/baa6487d-4836-485f-91b2-cbb55dc70030/volumes" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.683402 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v"] Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.684757 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.690915 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kxktp" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.703350 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v"] Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.871337 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.871404 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w54b6\" (UniqueName: \"kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6\") pod 
\"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.871442 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.973263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w54b6\" (UniqueName: \"kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.973321 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.973377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " 
pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.973824 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.973881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:16 crc kubenswrapper[4776]: I0128 07:04:16.991668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w54b6\" (UniqueName: \"kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6\") pod \"f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:17 crc kubenswrapper[4776]: I0128 07:04:17.000536 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:17 crc kubenswrapper[4776]: I0128 07:04:17.432386 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v"] Jan 28 07:04:17 crc kubenswrapper[4776]: W0128 07:04:17.440815 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29726b5f_7cef_4a70_8004_88f628782852.slice/crio-82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3 WatchSource:0}: Error finding container 82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3: Status 404 returned error can't find the container with id 82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3 Jan 28 07:04:18 crc kubenswrapper[4776]: I0128 07:04:18.274614 4776 generic.go:334] "Generic (PLEG): container finished" podID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerID="a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2" exitCode=0 Jan 28 07:04:18 crc kubenswrapper[4776]: I0128 07:04:18.274694 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerDied","Data":"a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2"} Jan 28 07:04:18 crc kubenswrapper[4776]: I0128 07:04:18.278561 4776 generic.go:334] "Generic (PLEG): container finished" podID="29726b5f-7cef-4a70-8004-88f628782852" containerID="20f1cdfd63410f77f8cc3087f8eed852627c6d1d9131be6d1b210094df181ac7" exitCode=0 Jan 28 07:04:18 crc kubenswrapper[4776]: I0128 07:04:18.278587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" 
event={"ID":"29726b5f-7cef-4a70-8004-88f628782852","Type":"ContainerDied","Data":"20f1cdfd63410f77f8cc3087f8eed852627c6d1d9131be6d1b210094df181ac7"} Jan 28 07:04:18 crc kubenswrapper[4776]: I0128 07:04:18.278638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" event={"ID":"29726b5f-7cef-4a70-8004-88f628782852","Type":"ContainerStarted","Data":"82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3"} Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.285471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerStarted","Data":"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8"} Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.286999 4776 generic.go:334] "Generic (PLEG): container finished" podID="29726b5f-7cef-4a70-8004-88f628782852" containerID="256806df6ca7b2eb0086f3d73bc833a90046d0d4640ef99033db067fbc728d97" exitCode=0 Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.287037 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" event={"ID":"29726b5f-7cef-4a70-8004-88f628782852","Type":"ContainerDied","Data":"256806df6ca7b2eb0086f3d73bc833a90046d0d4640ef99033db067fbc728d97"} Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.309798 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5vsd" podStartSLOduration=2.810670709 podStartE2EDuration="6.309780724s" podCreationTimestamp="2026-01-28 07:04:13 +0000 UTC" firstStartedPulling="2026-01-28 07:04:15.251077322 +0000 UTC m=+826.666737482" lastFinishedPulling="2026-01-28 07:04:18.750187337 +0000 UTC m=+830.165847497" observedRunningTime="2026-01-28 07:04:19.306703599 +0000 UTC m=+830.722363759" 
watchObservedRunningTime="2026-01-28 07:04:19.309780724 +0000 UTC m=+830.725440884" Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.540472 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.540774 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:19 crc kubenswrapper[4776]: I0128 07:04:19.577704 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:20 crc kubenswrapper[4776]: I0128 07:04:20.294910 4776 generic.go:334] "Generic (PLEG): container finished" podID="29726b5f-7cef-4a70-8004-88f628782852" containerID="8390ba273d6691b6bfb9b7b038eee12996958992286ae29142115afc34520373" exitCode=0 Jan 28 07:04:20 crc kubenswrapper[4776]: I0128 07:04:20.294943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" event={"ID":"29726b5f-7cef-4a70-8004-88f628782852","Type":"ContainerDied","Data":"8390ba273d6691b6bfb9b7b038eee12996958992286ae29142115afc34520373"} Jan 28 07:04:20 crc kubenswrapper[4776]: I0128 07:04:20.346467 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.641916 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.829747 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle\") pod \"29726b5f-7cef-4a70-8004-88f628782852\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.829853 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w54b6\" (UniqueName: \"kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6\") pod \"29726b5f-7cef-4a70-8004-88f628782852\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.829998 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util\") pod \"29726b5f-7cef-4a70-8004-88f628782852\" (UID: \"29726b5f-7cef-4a70-8004-88f628782852\") " Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.831268 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle" (OuterVolumeSpecName: "bundle") pod "29726b5f-7cef-4a70-8004-88f628782852" (UID: "29726b5f-7cef-4a70-8004-88f628782852"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.840535 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6" (OuterVolumeSpecName: "kube-api-access-w54b6") pod "29726b5f-7cef-4a70-8004-88f628782852" (UID: "29726b5f-7cef-4a70-8004-88f628782852"). InnerVolumeSpecName "kube-api-access-w54b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.860687 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util" (OuterVolumeSpecName: "util") pod "29726b5f-7cef-4a70-8004-88f628782852" (UID: "29726b5f-7cef-4a70-8004-88f628782852"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.938706 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.938771 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w54b6\" (UniqueName: \"kubernetes.io/projected/29726b5f-7cef-4a70-8004-88f628782852-kube-api-access-w54b6\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:21 crc kubenswrapper[4776]: I0128 07:04:21.938802 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29726b5f-7cef-4a70-8004-88f628782852-util\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:22 crc kubenswrapper[4776]: I0128 07:04:22.313592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" event={"ID":"29726b5f-7cef-4a70-8004-88f628782852","Type":"ContainerDied","Data":"82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3"} Jan 28 07:04:22 crc kubenswrapper[4776]: I0128 07:04:22.313875 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82da77b04929665957521dc407d7599070871c99cfe380dcf7ea7298ce31f6a3" Jan 28 07:04:22 crc kubenswrapper[4776]: I0128 07:04:22.313663 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v" Jan 28 07:04:23 crc kubenswrapper[4776]: I0128 07:04:23.011333 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:23 crc kubenswrapper[4776]: I0128 07:04:23.011867 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jn6pq" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="registry-server" containerID="cri-o://11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0" gracePeriod=2 Jan 28 07:04:23 crc kubenswrapper[4776]: I0128 07:04:23.949980 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:23 crc kubenswrapper[4776]: I0128 07:04:23.950387 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:23 crc kubenswrapper[4776]: I0128 07:04:23.991746 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.009034 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.163353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities\") pod \"55a7362c-68b7-4443-98c0-0d79dbf5b912\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.163464 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content\") pod \"55a7362c-68b7-4443-98c0-0d79dbf5b912\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.163494 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j757\" (UniqueName: \"kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757\") pod \"55a7362c-68b7-4443-98c0-0d79dbf5b912\" (UID: \"55a7362c-68b7-4443-98c0-0d79dbf5b912\") " Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.164721 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities" (OuterVolumeSpecName: "utilities") pod "55a7362c-68b7-4443-98c0-0d79dbf5b912" (UID: "55a7362c-68b7-4443-98c0-0d79dbf5b912"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.175816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757" (OuterVolumeSpecName: "kube-api-access-6j757") pod "55a7362c-68b7-4443-98c0-0d79dbf5b912" (UID: "55a7362c-68b7-4443-98c0-0d79dbf5b912"). InnerVolumeSpecName "kube-api-access-6j757". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.220482 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a7362c-68b7-4443-98c0-0d79dbf5b912" (UID: "55a7362c-68b7-4443-98c0-0d79dbf5b912"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.264937 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.264976 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a7362c-68b7-4443-98c0-0d79dbf5b912-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.264991 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j757\" (UniqueName: \"kubernetes.io/projected/55a7362c-68b7-4443-98c0-0d79dbf5b912-kube-api-access-6j757\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.329731 4776 generic.go:334] "Generic (PLEG): container finished" podID="55a7362c-68b7-4443-98c0-0d79dbf5b912" 
containerID="11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0" exitCode=0 Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.330793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerDied","Data":"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0"} Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.330821 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jn6pq" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.330860 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jn6pq" event={"ID":"55a7362c-68b7-4443-98c0-0d79dbf5b912","Type":"ContainerDied","Data":"60fcbb4a6c090f6716fa669b19ed2554aa4e8cce303a6641918e89a586a5408f"} Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.330892 4776 scope.go:117] "RemoveContainer" containerID="11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.373167 4776 scope.go:117] "RemoveContainer" containerID="bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.380133 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.387470 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jn6pq"] Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.414607 4776 scope.go:117] "RemoveContainer" containerID="02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.431663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.439152 4776 scope.go:117] "RemoveContainer" containerID="11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0" Jan 28 07:04:24 crc kubenswrapper[4776]: E0128 07:04:24.440253 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0\": container with ID starting with 11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0 not found: ID does not exist" containerID="11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.440282 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0"} err="failed to get container status \"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0\": rpc error: code = NotFound desc = could not find container \"11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0\": container with ID starting with 11665597dbdcf52760fee78440d507d974cbc1eb700d4c678b4768937b8df0b0 not found: ID does not exist" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.440302 4776 scope.go:117] "RemoveContainer" containerID="bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c" Jan 28 07:04:24 crc kubenswrapper[4776]: E0128 07:04:24.440559 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c\": container with ID starting with bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c not found: ID does not exist" containerID="bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.440611 
4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c"} err="failed to get container status \"bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c\": rpc error: code = NotFound desc = could not find container \"bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c\": container with ID starting with bb90850db61df6d7b8146f26a41b16da5ae1a1757af68fc1d18b8ab36129c74c not found: ID does not exist" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.440626 4776 scope.go:117] "RemoveContainer" containerID="02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0" Jan 28 07:04:24 crc kubenswrapper[4776]: E0128 07:04:24.440913 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0\": container with ID starting with 02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0 not found: ID does not exist" containerID="02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0" Jan 28 07:04:24 crc kubenswrapper[4776]: I0128 07:04:24.440935 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0"} err="failed to get container status \"02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0\": rpc error: code = NotFound desc = could not find container \"02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0\": container with ID starting with 02ebc6561655868aa55bcacdbffac65effca7d7e9352f0842cdcab9f204883f0 not found: ID does not exist" Jan 28 07:04:25 crc kubenswrapper[4776]: I0128 07:04:25.313011 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" 
path="/var/lib/kubelet/pods/55a7362c-68b7-4443-98c0-0d79dbf5b912/volumes" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.205876 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.346187 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5vsd" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="registry-server" containerID="cri-o://91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8" gracePeriod=2 Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627405 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg"] Jan 28 07:04:26 crc kubenswrapper[4776]: E0128 07:04:26.627705 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="pull" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627732 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="pull" Jan 28 07:04:26 crc kubenswrapper[4776]: E0128 07:04:26.627743 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="extract" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627751 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="extract" Jan 28 07:04:26 crc kubenswrapper[4776]: E0128 07:04:26.627762 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="extract-utilities" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627770 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="extract-utilities" Jan 28 07:04:26 crc 
kubenswrapper[4776]: E0128 07:04:26.627787 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="extract-content" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627794 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="extract-content" Jan 28 07:04:26 crc kubenswrapper[4776]: E0128 07:04:26.627805 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="util" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627812 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="util" Jan 28 07:04:26 crc kubenswrapper[4776]: E0128 07:04:26.627830 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="registry-server" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627837 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="registry-server" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627963 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="29726b5f-7cef-4a70-8004-88f628782852" containerName="extract" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.627981 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a7362c-68b7-4443-98c0-0d79dbf5b912" containerName="registry-server" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.629010 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.647893 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-22mz8" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.664787 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg"] Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.741786 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.802800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkhq\" (UniqueName: \"kubernetes.io/projected/bca7b855-4473-4cc2-aa88-38fd3de8fea8-kube-api-access-wwkhq\") pod \"openstack-operator-controller-init-5667c869b5-csrzg\" (UID: \"bca7b855-4473-4cc2-aa88-38fd3de8fea8\") " pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.903881 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities\") pod \"a30af7e8-61cf-4234-abaf-7e166b80f632\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.904146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xjf7\" (UniqueName: \"kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7\") pod \"a30af7e8-61cf-4234-abaf-7e166b80f632\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.904219 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content\") pod \"a30af7e8-61cf-4234-abaf-7e166b80f632\" (UID: \"a30af7e8-61cf-4234-abaf-7e166b80f632\") " Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.904528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkhq\" (UniqueName: \"kubernetes.io/projected/bca7b855-4473-4cc2-aa88-38fd3de8fea8-kube-api-access-wwkhq\") pod \"openstack-operator-controller-init-5667c869b5-csrzg\" (UID: \"bca7b855-4473-4cc2-aa88-38fd3de8fea8\") " pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.904737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities" (OuterVolumeSpecName: "utilities") pod "a30af7e8-61cf-4234-abaf-7e166b80f632" (UID: "a30af7e8-61cf-4234-abaf-7e166b80f632"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.920770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7" (OuterVolumeSpecName: "kube-api-access-8xjf7") pod "a30af7e8-61cf-4234-abaf-7e166b80f632" (UID: "a30af7e8-61cf-4234-abaf-7e166b80f632"). InnerVolumeSpecName "kube-api-access-8xjf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.932827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a30af7e8-61cf-4234-abaf-7e166b80f632" (UID: "a30af7e8-61cf-4234-abaf-7e166b80f632"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.942056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkhq\" (UniqueName: \"kubernetes.io/projected/bca7b855-4473-4cc2-aa88-38fd3de8fea8-kube-api-access-wwkhq\") pod \"openstack-operator-controller-init-5667c869b5-csrzg\" (UID: \"bca7b855-4473-4cc2-aa88-38fd3de8fea8\") " pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:26 crc kubenswrapper[4776]: I0128 07:04:26.979176 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.005384 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.005426 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xjf7\" (UniqueName: \"kubernetes.io/projected/a30af7e8-61cf-4234-abaf-7e166b80f632-kube-api-access-8xjf7\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.005440 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30af7e8-61cf-4234-abaf-7e166b80f632-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.342352 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg"] Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.360654 4776 generic.go:334] "Generic (PLEG): container finished" podID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerID="91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8" exitCode=0 Jan 28 
07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.360713 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5vsd" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.360736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerDied","Data":"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8"} Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.361707 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5vsd" event={"ID":"a30af7e8-61cf-4234-abaf-7e166b80f632","Type":"ContainerDied","Data":"904f0211c7d20b3b5cc7a93482f88ce0dd1c0194f1240dd1232fa9da6847871b"} Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.361739 4776 scope.go:117] "RemoveContainer" containerID="91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.363102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" event={"ID":"bca7b855-4473-4cc2-aa88-38fd3de8fea8","Type":"ContainerStarted","Data":"1e15a955f12f52cd0ab7f69a56a095a6c5ae2f9cde682d521633cf3488d42598"} Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.382830 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.383562 4776 scope.go:117] "RemoveContainer" containerID="a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.387678 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5vsd"] Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.409817 4776 scope.go:117] "RemoveContainer" 
containerID="050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.428205 4776 scope.go:117] "RemoveContainer" containerID="91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8" Jan 28 07:04:27 crc kubenswrapper[4776]: E0128 07:04:27.428603 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8\": container with ID starting with 91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8 not found: ID does not exist" containerID="91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.428685 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8"} err="failed to get container status \"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8\": rpc error: code = NotFound desc = could not find container \"91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8\": container with ID starting with 91faa1e11f6bb8d1e980ac12ae8db25a3a20ae75754348b268363428a8909fd8 not found: ID does not exist" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.428712 4776 scope.go:117] "RemoveContainer" containerID="a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2" Jan 28 07:04:27 crc kubenswrapper[4776]: E0128 07:04:27.429149 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2\": container with ID starting with a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2 not found: ID does not exist" containerID="a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2" Jan 28 07:04:27 crc 
kubenswrapper[4776]: I0128 07:04:27.429174 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2"} err="failed to get container status \"a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2\": rpc error: code = NotFound desc = could not find container \"a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2\": container with ID starting with a0637fde6ee9e080502af1ba74ec1075a3bf93be66874641c932a77888ab18a2 not found: ID does not exist" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.429226 4776 scope.go:117] "RemoveContainer" containerID="050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344" Jan 28 07:04:27 crc kubenswrapper[4776]: E0128 07:04:27.429462 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344\": container with ID starting with 050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344 not found: ID does not exist" containerID="050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344" Jan 28 07:04:27 crc kubenswrapper[4776]: I0128 07:04:27.429487 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344"} err="failed to get container status \"050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344\": rpc error: code = NotFound desc = could not find container \"050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344\": container with ID starting with 050a337710ce7147cb9048f869f94f14abb1d04cbc78b193f7ae2b3028592344 not found: ID does not exist" Jan 28 07:04:29 crc kubenswrapper[4776]: I0128 07:04:29.320890 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" 
path="/var/lib/kubelet/pods/a30af7e8-61cf-4234-abaf-7e166b80f632/volumes" Jan 28 07:04:31 crc kubenswrapper[4776]: I0128 07:04:31.396331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" event={"ID":"bca7b855-4473-4cc2-aa88-38fd3de8fea8","Type":"ContainerStarted","Data":"c3c73a45f4bb91de63d7a81722653012ab5af5e6bd1521154e0741624bc573b6"} Jan 28 07:04:31 crc kubenswrapper[4776]: I0128 07:04:31.397786 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:31 crc kubenswrapper[4776]: I0128 07:04:31.428050 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" podStartSLOduration=1.67245987 podStartE2EDuration="5.428028542s" podCreationTimestamp="2026-01-28 07:04:26 +0000 UTC" firstStartedPulling="2026-01-28 07:04:27.350613145 +0000 UTC m=+838.766273305" lastFinishedPulling="2026-01-28 07:04:31.106181817 +0000 UTC m=+842.521841977" observedRunningTime="2026-01-28 07:04:31.42612817 +0000 UTC m=+842.841788380" watchObservedRunningTime="2026-01-28 07:04:31.428028542 +0000 UTC m=+842.843688702" Jan 28 07:04:33 crc kubenswrapper[4776]: I0128 07:04:33.852120 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:04:33 crc kubenswrapper[4776]: I0128 07:04:33.852418 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 28 07:04:33 crc kubenswrapper[4776]: I0128 07:04:33.852479 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:04:33 crc kubenswrapper[4776]: I0128 07:04:33.853164 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:04:33 crc kubenswrapper[4776]: I0128 07:04:33.853222 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d" gracePeriod=600 Jan 28 07:04:34 crc kubenswrapper[4776]: I0128 07:04:34.420024 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d" exitCode=0 Jan 28 07:04:34 crc kubenswrapper[4776]: I0128 07:04:34.420338 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d"} Jan 28 07:04:34 crc kubenswrapper[4776]: I0128 07:04:34.420422 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" 
event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944"} Jan 28 07:04:34 crc kubenswrapper[4776]: I0128 07:04:34.420450 4776 scope.go:117] "RemoveContainer" containerID="de406494f5986cb272819651fdda864d086b81af18822e3493914679a641f0e0" Jan 28 07:04:36 crc kubenswrapper[4776]: I0128 07:04:36.984677 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5667c869b5-csrzg" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.658047 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst"] Jan 28 07:04:55 crc kubenswrapper[4776]: E0128 07:04:55.658783 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="extract-utilities" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.658795 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="extract-utilities" Jan 28 07:04:55 crc kubenswrapper[4776]: E0128 07:04:55.658807 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="extract-content" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.658813 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="extract-content" Jan 28 07:04:55 crc kubenswrapper[4776]: E0128 07:04:55.658822 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="registry-server" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.658827 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="registry-server" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.658946 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30af7e8-61cf-4234-abaf-7e166b80f632" containerName="registry-server" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.659466 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.661023 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-kjqd2" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.664566 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.665595 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.667105 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jj6nb" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.670427 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.676328 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.695741 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.696581 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.703926 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dvbh6" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.711425 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.720355 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.721234 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.724649 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-92v6l" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.736494 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.750476 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.751225 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.753528 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8vp45" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.758132 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqnc\" (UniqueName: \"kubernetes.io/projected/79937ab5-c85f-4a4a-b35f-3b5d3711cbf0-kube-api-access-bpqnc\") pod \"cinder-operator-controller-manager-655bf9cfbb-wjdst\" (UID: \"79937ab5-c85f-4a4a-b35f-3b5d3711cbf0\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.758195 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqkt\" (UniqueName: \"kubernetes.io/projected/b5c3560a-18be-4f65-a9f7-0dddccb36193-kube-api-access-qgqkt\") pod \"designate-operator-controller-manager-77554cdc5c-vpc6t\" (UID: \"b5c3560a-18be-4f65-a9f7-0dddccb36193\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.758232 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgp4\" (UniqueName: \"kubernetes.io/projected/bd49109f-40b2-4db9-92d7-75aaf1093a21-kube-api-access-fxgp4\") pod \"barbican-operator-controller-manager-65ff799cfd-vzbmt\" (UID: \"bd49109f-40b2-4db9-92d7-75aaf1093a21\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.770043 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 
07:04:55.802694 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.803651 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.814329 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-sq6jg" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.818043 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.840142 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.846379 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5vqzk" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.862746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlsf5\" (UniqueName: \"kubernetes.io/projected/1a0ddddf-b0e4-4bdb-bf00-c978366213a0-kube-api-access-rlsf5\") pod \"heat-operator-controller-manager-575ffb885b-pxsb4\" (UID: \"1a0ddddf-b0e4-4bdb-bf00-c978366213a0\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.862868 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqkt\" (UniqueName: \"kubernetes.io/projected/b5c3560a-18be-4f65-a9f7-0dddccb36193-kube-api-access-qgqkt\") pod \"designate-operator-controller-manager-77554cdc5c-vpc6t\" 
(UID: \"b5c3560a-18be-4f65-a9f7-0dddccb36193\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.864237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcjj\" (UniqueName: \"kubernetes.io/projected/11a6de65-3758-4462-b2b0-9499232f8c29-kube-api-access-jdcjj\") pod \"glance-operator-controller-manager-67dd55ff59-xw2v6\" (UID: \"11a6de65-3758-4462-b2b0-9499232f8c29\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.864311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgp4\" (UniqueName: \"kubernetes.io/projected/bd49109f-40b2-4db9-92d7-75aaf1093a21-kube-api-access-fxgp4\") pod \"barbican-operator-controller-manager-65ff799cfd-vzbmt\" (UID: \"bd49109f-40b2-4db9-92d7-75aaf1093a21\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.864493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqnc\" (UniqueName: \"kubernetes.io/projected/79937ab5-c85f-4a4a-b35f-3b5d3711cbf0-kube-api-access-bpqnc\") pod \"cinder-operator-controller-manager-655bf9cfbb-wjdst\" (UID: \"79937ab5-c85f-4a4a-b35f-3b5d3711cbf0\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.864522 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz88h\" (UniqueName: \"kubernetes.io/projected/60dda427-fb0c-41c7-8ca8-9847554068f1-kube-api-access-kz88h\") pod \"horizon-operator-controller-manager-77d5c5b54f-vwpnd\" (UID: \"60dda427-fb0c-41c7-8ca8-9847554068f1\") " 
pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.910834 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.911717 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.911838 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqkt\" (UniqueName: \"kubernetes.io/projected/b5c3560a-18be-4f65-a9f7-0dddccb36193-kube-api-access-qgqkt\") pod \"designate-operator-controller-manager-77554cdc5c-vpc6t\" (UID: \"b5c3560a-18be-4f65-a9f7-0dddccb36193\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.915981 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q2499" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.916976 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.925738 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqnc\" (UniqueName: \"kubernetes.io/projected/79937ab5-c85f-4a4a-b35f-3b5d3711cbf0-kube-api-access-bpqnc\") pod \"cinder-operator-controller-manager-655bf9cfbb-wjdst\" (UID: \"79937ab5-c85f-4a4a-b35f-3b5d3711cbf0\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.927098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgp4\" (UniqueName: 
\"kubernetes.io/projected/bd49109f-40b2-4db9-92d7-75aaf1093a21-kube-api-access-fxgp4\") pod \"barbican-operator-controller-manager-65ff799cfd-vzbmt\" (UID: \"bd49109f-40b2-4db9-92d7-75aaf1093a21\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.964718 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5tn\" (UniqueName: \"kubernetes.io/projected/f9f1432a-2977-49f8-924a-5c82c86f1de0-kube-api-access-7j5tn\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965521 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzpw\" (UniqueName: \"kubernetes.io/projected/846af064-1eb1-4384-9b88-95770199bcdc-kube-api-access-2jzpw\") pod \"ironic-operator-controller-manager-768b776ffb-4mt8c\" (UID: \"846af064-1eb1-4384-9b88-95770199bcdc\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965686 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz88h\" (UniqueName: \"kubernetes.io/projected/60dda427-fb0c-41c7-8ca8-9847554068f1-kube-api-access-kz88h\") pod \"horizon-operator-controller-manager-77d5c5b54f-vwpnd\" (UID: \"60dda427-fb0c-41c7-8ca8-9847554068f1\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965744 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rlsf5\" (UniqueName: \"kubernetes.io/projected/1a0ddddf-b0e4-4bdb-bf00-c978366213a0-kube-api-access-rlsf5\") pod \"heat-operator-controller-manager-575ffb885b-pxsb4\" (UID: \"1a0ddddf-b0e4-4bdb-bf00-c978366213a0\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965781 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.965835 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcjj\" (UniqueName: \"kubernetes.io/projected/11a6de65-3758-4462-b2b0-9499232f8c29-kube-api-access-jdcjj\") pod \"glance-operator-controller-manager-67dd55ff59-xw2v6\" (UID: \"11a6de65-3758-4462-b2b0-9499232f8c29\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.983443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd"] Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.987370 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.989594 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlsf5\" (UniqueName: \"kubernetes.io/projected/1a0ddddf-b0e4-4bdb-bf00-c978366213a0-kube-api-access-rlsf5\") pod \"heat-operator-controller-manager-575ffb885b-pxsb4\" (UID: \"1a0ddddf-b0e4-4bdb-bf00-c978366213a0\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" Jan 28 07:04:55 crc kubenswrapper[4776]: I0128 07:04:55.994056 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.005286 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcjj\" (UniqueName: \"kubernetes.io/projected/11a6de65-3758-4462-b2b0-9499232f8c29-kube-api-access-jdcjj\") pod \"glance-operator-controller-manager-67dd55ff59-xw2v6\" (UID: \"11a6de65-3758-4462-b2b0-9499232f8c29\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.012633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz88h\" (UniqueName: \"kubernetes.io/projected/60dda427-fb0c-41c7-8ca8-9847554068f1-kube-api-access-kz88h\") pod \"horizon-operator-controller-manager-77d5c5b54f-vwpnd\" (UID: \"60dda427-fb0c-41c7-8ca8-9847554068f1\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.020987 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.030832 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.044588 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.063191 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.064066 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.064152 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.068989 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.069096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5tn\" (UniqueName: \"kubernetes.io/projected/f9f1432a-2977-49f8-924a-5c82c86f1de0-kube-api-access-7j5tn\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.069171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzpw\" (UniqueName: \"kubernetes.io/projected/846af064-1eb1-4384-9b88-95770199bcdc-kube-api-access-2jzpw\") pod \"ironic-operator-controller-manager-768b776ffb-4mt8c\" (UID: \"846af064-1eb1-4384-9b88-95770199bcdc\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.069842 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.069892 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert podName:f9f1432a-2977-49f8-924a-5c82c86f1de0 nodeName:}" failed. 
No retries permitted until 2026-01-28 07:04:56.56987505 +0000 UTC m=+867.985535210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert") pod "infra-operator-controller-manager-7d75bc88d5-2fbq4" (UID: "f9f1432a-2977-49f8-924a-5c82c86f1de0") : secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.070528 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2ldhh" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.073395 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.074325 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.074905 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.078090 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-w7r6j" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.082154 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.083108 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.091033 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-j4k7h" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.092326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5tn\" (UniqueName: \"kubernetes.io/projected/f9f1432a-2977-49f8-924a-5c82c86f1de0-kube-api-access-7j5tn\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.096612 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzpw\" (UniqueName: \"kubernetes.io/projected/846af064-1eb1-4384-9b88-95770199bcdc-kube-api-access-2jzpw\") pod \"ironic-operator-controller-manager-768b776ffb-4mt8c\" (UID: \"846af064-1eb1-4384-9b88-95770199bcdc\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.104055 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.116108 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.125649 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.127025 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.130680 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lhz8n" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.138239 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.139452 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.142060 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-vs764" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.144774 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.147274 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.151599 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.152887 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.156006 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-sfv2j" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.158252 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.162584 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.165558 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.170194 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.171015 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l4dcj" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.171915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6b54\" (UniqueName: \"kubernetes.io/projected/93dd9036-0e5e-4817-9a6c-eb89469de01b-kube-api-access-d6b54\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-lltvt\" (UID: \"93dd9036-0e5e-4817-9a6c-eb89469de01b\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.171975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bnsfj\" (UniqueName: \"kubernetes.io/projected/39d3648e-5826-4e8a-b252-cb75e28651db-kube-api-access-bnsfj\") pod \"manila-operator-controller-manager-849fcfbb6b-pxjl4\" (UID: \"39d3648e-5826-4e8a-b252-cb75e28651db\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.171999 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsj6q\" (UniqueName: \"kubernetes.io/projected/38646136-0a67-43c4-90ee-d88ae407d654-kube-api-access-xsj6q\") pod \"neutron-operator-controller-manager-7ffd8d76d4-vrlcf\" (UID: \"38646136-0a67-43c4-90ee-d88ae407d654\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.172021 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ls2h\" (UniqueName: \"kubernetes.io/projected/6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7-kube-api-access-5ls2h\") pod \"nova-operator-controller-manager-ddcbfd695-wwhp5\" (UID: \"6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.172122 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sb6j\" (UniqueName: \"kubernetes.io/projected/bf30e81e-a5a3-4af7-9a47-673f431d3666-kube-api-access-2sb6j\") pod \"keystone-operator-controller-manager-55f684fd56-d2j6c\" (UID: \"bf30e81e-a5a3-4af7-9a47-673f431d3666\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.180425 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 
07:04:56.185224 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.190787 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.191861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.199727 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kqpnj" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.202077 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.210510 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.228859 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.232174 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.232727 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.233310 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.238621 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.241396 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-szn2c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.241403 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qftmt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.256359 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.275248 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n54\" (UniqueName: \"kubernetes.io/projected/9390b35e-9791-4ef4-ab66-12c4662f4cdf-kube-api-access-x5n54\") pod \"placement-operator-controller-manager-79d5ccc684-6kqqd\" (UID: \"9390b35e-9791-4ef4-ab66-12c4662f4cdf\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.275291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsfj\" (UniqueName: \"kubernetes.io/projected/39d3648e-5826-4e8a-b252-cb75e28651db-kube-api-access-bnsfj\") pod \"manila-operator-controller-manager-849fcfbb6b-pxjl4\" (UID: \"39d3648e-5826-4e8a-b252-cb75e28651db\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.275316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xsj6q\" (UniqueName: \"kubernetes.io/projected/38646136-0a67-43c4-90ee-d88ae407d654-kube-api-access-xsj6q\") pod \"neutron-operator-controller-manager-7ffd8d76d4-vrlcf\" (UID: \"38646136-0a67-43c4-90ee-d88ae407d654\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.275335 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvvqz\" (UniqueName: \"kubernetes.io/projected/6a91b170-b0ed-4156-a9ee-74efca2560e7-kube-api-access-dvvqz\") pod \"ovn-operator-controller-manager-6f75f45d54-4xx26\" (UID: \"6a91b170-b0ed-4156-a9ee-74efca2560e7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.275356 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ls2h\" (UniqueName: \"kubernetes.io/projected/6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7-kube-api-access-5ls2h\") pod \"nova-operator-controller-manager-ddcbfd695-wwhp5\" (UID: \"6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.278890 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.278927 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9qg\" (UniqueName: 
\"kubernetes.io/projected/a4407ded-de50-4ae5-bf84-2d6a3baa565c-kube-api-access-9d9qg\") pod \"octavia-operator-controller-manager-94dd99d7d-gxmgb\" (UID: \"a4407ded-de50-4ae5-bf84-2d6a3baa565c\") " pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.278997 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sb6j\" (UniqueName: \"kubernetes.io/projected/bf30e81e-a5a3-4af7-9a47-673f431d3666-kube-api-access-2sb6j\") pod \"keystone-operator-controller-manager-55f684fd56-d2j6c\" (UID: \"bf30e81e-a5a3-4af7-9a47-673f431d3666\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.279028 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmvd\" (UniqueName: \"kubernetes.io/projected/5bc7efb1-0792-40f2-993a-eb865919048c-kube-api-access-ktmvd\") pod \"swift-operator-controller-manager-547cbdb99f-8w6p2\" (UID: \"5bc7efb1-0792-40f2-993a-eb865919048c\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.279079 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s885\" (UniqueName: \"kubernetes.io/projected/9eae60fd-6135-4e41-bb77-e3caae71237d-kube-api-access-8s885\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.279113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6b54\" (UniqueName: \"kubernetes.io/projected/93dd9036-0e5e-4817-9a6c-eb89469de01b-kube-api-access-d6b54\") pod 
\"mariadb-operator-controller-manager-6b9fb5fdcb-lltvt\" (UID: \"93dd9036-0e5e-4817-9a6c-eb89469de01b\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.280762 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.281560 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.281637 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.283631 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-z5d4n" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.311569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ls2h\" (UniqueName: \"kubernetes.io/projected/6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7-kube-api-access-5ls2h\") pod \"nova-operator-controller-manager-ddcbfd695-wwhp5\" (UID: \"6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.317639 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.318612 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.322355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsj6q\" (UniqueName: \"kubernetes.io/projected/38646136-0a67-43c4-90ee-d88ae407d654-kube-api-access-xsj6q\") pod \"neutron-operator-controller-manager-7ffd8d76d4-vrlcf\" (UID: \"38646136-0a67-43c4-90ee-d88ae407d654\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.349888 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zltzf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.353187 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sb6j\" (UniqueName: \"kubernetes.io/projected/bf30e81e-a5a3-4af7-9a47-673f431d3666-kube-api-access-2sb6j\") pod \"keystone-operator-controller-manager-55f684fd56-d2j6c\" (UID: \"bf30e81e-a5a3-4af7-9a47-673f431d3666\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.366156 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsfj\" (UniqueName: \"kubernetes.io/projected/39d3648e-5826-4e8a-b252-cb75e28651db-kube-api-access-bnsfj\") pod \"manila-operator-controller-manager-849fcfbb6b-pxjl4\" (UID: \"39d3648e-5826-4e8a-b252-cb75e28651db\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.383211 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvvqz\" (UniqueName: \"kubernetes.io/projected/6a91b170-b0ed-4156-a9ee-74efca2560e7-kube-api-access-dvvqz\") pod \"ovn-operator-controller-manager-6f75f45d54-4xx26\" (UID: 
\"6a91b170-b0ed-4156-a9ee-74efca2560e7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.383467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.383695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9qg\" (UniqueName: \"kubernetes.io/projected/a4407ded-de50-4ae5-bf84-2d6a3baa565c-kube-api-access-9d9qg\") pod \"octavia-operator-controller-manager-94dd99d7d-gxmgb\" (UID: \"a4407ded-de50-4ae5-bf84-2d6a3baa565c\") " pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.383732 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmj87\" (UniqueName: \"kubernetes.io/projected/2f79777a-6f48-42d4-b39e-4393e932aea0-kube-api-access-hmj87\") pod \"test-operator-controller-manager-69797bbcbd-8dl97\" (UID: \"2f79777a-6f48-42d4-b39e-4393e932aea0\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.383906 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.383972 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. 
No retries permitted until 2026-01-28 07:04:56.88395203 +0000 UTC m=+868.299612190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.384412 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmvd\" (UniqueName: \"kubernetes.io/projected/5bc7efb1-0792-40f2-993a-eb865919048c-kube-api-access-ktmvd\") pod \"swift-operator-controller-manager-547cbdb99f-8w6p2\" (UID: \"5bc7efb1-0792-40f2-993a-eb865919048c\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.384459 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxh8f\" (UniqueName: \"kubernetes.io/projected/6f563213-8471-44f5-83aa-820e73ed7746-kube-api-access-qxh8f\") pod \"telemetry-operator-controller-manager-799bc87c89-jfcj6\" (UID: \"6f563213-8471-44f5-83aa-820e73ed7746\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.384504 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s885\" (UniqueName: \"kubernetes.io/projected/9eae60fd-6135-4e41-bb77-e3caae71237d-kube-api-access-8s885\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.385198 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x5n54\" (UniqueName: \"kubernetes.io/projected/9390b35e-9791-4ef4-ab66-12c4662f4cdf-kube-api-access-x5n54\") pod \"placement-operator-controller-manager-79d5ccc684-6kqqd\" (UID: \"9390b35e-9791-4ef4-ab66-12c4662f4cdf\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.393879 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.418662 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.425983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6b54\" (UniqueName: \"kubernetes.io/projected/93dd9036-0e5e-4817-9a6c-eb89469de01b-kube-api-access-d6b54\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-lltvt\" (UID: \"93dd9036-0e5e-4817-9a6c-eb89469de01b\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.426699 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.427187 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.428369 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n54\" (UniqueName: \"kubernetes.io/projected/9390b35e-9791-4ef4-ab66-12c4662f4cdf-kube-api-access-x5n54\") pod \"placement-operator-controller-manager-79d5ccc684-6kqqd\" (UID: \"9390b35e-9791-4ef4-ab66-12c4662f4cdf\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.457519 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.472857 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.475497 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.478659 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.480903 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d9qg\" (UniqueName: \"kubernetes.io/projected/a4407ded-de50-4ae5-bf84-2d6a3baa565c-kube-api-access-9d9qg\") pod \"octavia-operator-controller-manager-94dd99d7d-gxmgb\" (UID: \"a4407ded-de50-4ae5-bf84-2d6a3baa565c\") " pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.482508 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s885\" (UniqueName: \"kubernetes.io/projected/9eae60fd-6135-4e41-bb77-e3caae71237d-kube-api-access-8s885\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.483514 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zhdrm" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.486453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmj87\" (UniqueName: \"kubernetes.io/projected/2f79777a-6f48-42d4-b39e-4393e932aea0-kube-api-access-hmj87\") pod \"test-operator-controller-manager-69797bbcbd-8dl97\" (UID: \"2f79777a-6f48-42d4-b39e-4393e932aea0\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.486511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxh8f\" (UniqueName: \"kubernetes.io/projected/6f563213-8471-44f5-83aa-820e73ed7746-kube-api-access-qxh8f\") pod 
\"telemetry-operator-controller-manager-799bc87c89-jfcj6\" (UID: \"6f563213-8471-44f5-83aa-820e73ed7746\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.488159 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.489147 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.506016 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.506113 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvvqz\" (UniqueName: \"kubernetes.io/projected/6a91b170-b0ed-4156-a9ee-74efca2560e7-kube-api-access-dvvqz\") pod \"ovn-operator-controller-manager-6f75f45d54-4xx26\" (UID: \"6a91b170-b0ed-4156-a9ee-74efca2560e7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.506692 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmvd\" (UniqueName: \"kubernetes.io/projected/5bc7efb1-0792-40f2-993a-eb865919048c-kube-api-access-ktmvd\") pod \"swift-operator-controller-manager-547cbdb99f-8w6p2\" (UID: \"5bc7efb1-0792-40f2-993a-eb865919048c\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.506911 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.512398 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.512541 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.512731 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s9dpc" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.524596 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.555898 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.568607 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmj87\" (UniqueName: \"kubernetes.io/projected/2f79777a-6f48-42d4-b39e-4393e932aea0-kube-api-access-hmj87\") pod \"test-operator-controller-manager-69797bbcbd-8dl97\" (UID: \"2f79777a-6f48-42d4-b39e-4393e932aea0\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.580569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxh8f\" (UniqueName: \"kubernetes.io/projected/6f563213-8471-44f5-83aa-820e73ed7746-kube-api-access-qxh8f\") pod \"telemetry-operator-controller-manager-799bc87c89-jfcj6\" (UID: \"6f563213-8471-44f5-83aa-820e73ed7746\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:04:56 crc 
kubenswrapper[4776]: I0128 07:04:56.582014 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.587685 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.587871 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.588517 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.588579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rz6\" (UniqueName: \"kubernetes.io/projected/70aa7185-ded8-4807-822c-69fc5b03feeb-kube-api-access-j2rz6\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " 
pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.588670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf2b5\" (UniqueName: \"kubernetes.io/projected/3b6f6ae6-4641-4dd2-9021-197e9ea97b2b-kube-api-access-pf2b5\") pod \"watcher-operator-controller-manager-66fbd46fdf-dpq5g\" (UID: \"3b6f6ae6-4641-4dd2-9021-197e9ea97b2b\") " pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.588831 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.588874 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert podName:f9f1432a-2977-49f8-924a-5c82c86f1de0 nodeName:}" failed. No retries permitted until 2026-01-28 07:04:57.588860741 +0000 UTC m=+869.004520901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert") pod "infra-operator-controller-manager-7d75bc88d5-2fbq4" (UID: "f9f1432a-2977-49f8-924a-5c82c86f1de0") : secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.605296 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.606126 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.609426 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-rhqcg" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.613478 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl"] Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.623578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.637722 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.656327 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.689475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9qv6\" (UniqueName: \"kubernetes.io/projected/22f3f762-cc29-4a18-8bfc-430b85e041cc-kube-api-access-f9qv6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8wdwl\" (UID: \"22f3f762-cc29-4a18-8bfc-430b85e041cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.689580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.689648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.689683 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rz6\" (UniqueName: \"kubernetes.io/projected/70aa7185-ded8-4807-822c-69fc5b03feeb-kube-api-access-j2rz6\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.689723 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf2b5\" (UniqueName: \"kubernetes.io/projected/3b6f6ae6-4641-4dd2-9021-197e9ea97b2b-kube-api-access-pf2b5\") pod \"watcher-operator-controller-manager-66fbd46fdf-dpq5g\" (UID: \"3b6f6ae6-4641-4dd2-9021-197e9ea97b2b\") " pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.690603 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.690658 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:04:57.190643088 +0000 UTC m=+868.606303238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.690765 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.690832 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:04:57.190814842 +0000 UTC m=+868.606475002 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.714173 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf2b5\" (UniqueName: \"kubernetes.io/projected/3b6f6ae6-4641-4dd2-9021-197e9ea97b2b-kube-api-access-pf2b5\") pod \"watcher-operator-controller-manager-66fbd46fdf-dpq5g\" (UID: \"3b6f6ae6-4641-4dd2-9021-197e9ea97b2b\") " pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.714787 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rz6\" (UniqueName: \"kubernetes.io/projected/70aa7185-ded8-4807-822c-69fc5b03feeb-kube-api-access-j2rz6\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.791827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9qv6\" (UniqueName: \"kubernetes.io/projected/22f3f762-cc29-4a18-8bfc-430b85e041cc-kube-api-access-f9qv6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8wdwl\" (UID: \"22f3f762-cc29-4a18-8bfc-430b85e041cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.819567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9qv6\" (UniqueName: \"kubernetes.io/projected/22f3f762-cc29-4a18-8bfc-430b85e041cc-kube-api-access-f9qv6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8wdwl\" (UID: 
\"22f3f762-cc29-4a18-8bfc-430b85e041cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.892862 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.893046 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: E0128 07:04:56.893103 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. No retries permitted until 2026-01-28 07:04:57.893086661 +0000 UTC m=+869.308746821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:56 crc kubenswrapper[4776]: I0128 07:04:56.977356 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.075404 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.199619 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.199668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.199804 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.199851 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:04:58.199837361 +0000 UTC m=+869.615497521 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.200200 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.200229 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:04:58.200222561 +0000 UTC m=+869.615882721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.509619 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.513869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.604680 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:57 crc 
kubenswrapper[4776]: E0128 07:04:57.604887 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.604940 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert podName:f9f1432a-2977-49f8-924a-5c82c86f1de0 nodeName:}" failed. No retries permitted until 2026-01-28 07:04:59.604926712 +0000 UTC m=+871.020586872 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert") pod "infra-operator-controller-manager-7d75bc88d5-2fbq4" (UID: "f9f1432a-2977-49f8-924a-5c82c86f1de0") : secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.608469 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" event={"ID":"bd49109f-40b2-4db9-92d7-75aaf1093a21","Type":"ContainerStarted","Data":"94fac868ce1fcef42a6762134a54248756a2ab1f502e716908e53b9792e6e32b"} Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.609506 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" event={"ID":"79937ab5-c85f-4a4a-b35f-3b5d3711cbf0","Type":"ContainerStarted","Data":"5caeb9b2ce3d60ffbe43a7fc3849cbd7efd07b59f39da9403b7d0143093d2f07"} Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.911144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 
07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.911407 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: E0128 07:04:57.911470 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. No retries permitted until 2026-01-28 07:04:59.911452676 +0000 UTC m=+871.327112836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.951630 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.960682 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.969257 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.973513 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt"] Jan 28 07:04:57 crc kubenswrapper[4776]: I0128 07:04:57.978716 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.006182 4776 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.054027 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.087893 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97"] Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.096350 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgqkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-77554cdc5c-vpc6t_openstack-operators(b5c3560a-18be-4f65-a9f7-0dddccb36193): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.098349 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" podUID="b5c3560a-18be-4f65-a9f7-0dddccb36193" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.107172 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t"] Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.115246 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxh8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-799bc87c89-jfcj6_openstack-operators(6f563213-8471-44f5-83aa-820e73ed7746): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.115513 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/neutron-operator@sha256:14786c3a66c41213a03d6375c03209f22d439dd6e752317ddcbe21dda66bb569,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xsj6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7ffd8d76d4-vrlcf_openstack-operators(38646136-0a67-43c4-90ee-d88ae407d654): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.116630 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" podUID="6f563213-8471-44f5-83aa-820e73ed7746" Jan 28 07:04:58 crc 
kubenswrapper[4776]: E0128 07:04:58.116709 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" podUID="38646136-0a67-43c4-90ee-d88ae407d654" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.120716 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26"] Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.127710 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.193:5001/openstack-k8s-operators/watcher-operator:8b30c5e7d456cae7da7365ebd2c0546fea54112e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pf2b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-66fbd46fdf-dpq5g_openstack-operators(3b6f6ae6-4641-4dd2-9021-197e9ea97b2b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.134832 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" podUID="3b6f6ae6-4641-4dd2-9021-197e9ea97b2b" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.134907 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.139520 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.145677 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.148598 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.154886 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.163776 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd"] Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.168379 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6"] Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.174144 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/octavia-operator@sha256:e75170b995315ff39a52c1ee42fa65486f828a81bad9248710d3361c9847800d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9d9qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-94dd99d7d-gxmgb_openstack-operators(a4407ded-de50-4ae5-bf84-2d6a3baa565c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.175596 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" podUID="a4407ded-de50-4ae5-bf84-2d6a3baa565c" Jan 28 07:04:58 crc 
kubenswrapper[4776]: I0128 07:04:58.184259 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl"] Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.188126 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdcjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-67dd55ff59-xw2v6_openstack-operators(11a6de65-3758-4462-b2b0-9499232f8c29): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.189209 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" podUID="11a6de65-3758-4462-b2b0-9499232f8c29" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.189489 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x5n54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-79d5ccc684-6kqqd_openstack-operators(9390b35e-9791-4ef4-ab66-12c4662f4cdf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.191219 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" podUID="9390b35e-9791-4ef4-ab66-12c4662f4cdf" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.206093 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9qv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-8wdwl_openstack-operators(22f3f762-cc29-4a18-8bfc-430b85e041cc): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.207684 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" podUID="22f3f762-cc29-4a18-8bfc-430b85e041cc" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.225050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.225107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.225314 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.225361 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:00.225348221 +0000 UTC m=+871.641008381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.225482 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.225505 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:00.225498836 +0000 UTC m=+871.641158996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.629672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" event={"ID":"2f79777a-6f48-42d4-b39e-4393e932aea0","Type":"ContainerStarted","Data":"8975a95bd95b5f62bc8fa1d3181ff8bd696bc20adf1c2ace69f5c06de9053b3f"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.638366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" event={"ID":"1a0ddddf-b0e4-4bdb-bf00-c978366213a0","Type":"ContainerStarted","Data":"045eb60f08f67608d1151be28553150a21801143d095c8e9a603432a9324f748"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.639740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" event={"ID":"9390b35e-9791-4ef4-ab66-12c4662f4cdf","Type":"ContainerStarted","Data":"4e6ff76d2afba256f182a29f0705fe0c5125296517b0d86bd0f35f3dc340f085"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.641065 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" podUID="9390b35e-9791-4ef4-ab66-12c4662f4cdf" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.641497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" event={"ID":"11a6de65-3758-4462-b2b0-9499232f8c29","Type":"ContainerStarted","Data":"e2ff64eaf6dc4c6cb7053393b30ed46e45798271b5897a2bf3101ee7b0b4a997"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.642861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" event={"ID":"3b6f6ae6-4641-4dd2-9021-197e9ea97b2b","Type":"ContainerStarted","Data":"808361553d3ce8d8c13748880cb90802805e56f27683bedfbc0f5f4e31b167dc"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.642860 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39\\\"\"" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" podUID="11a6de65-3758-4462-b2b0-9499232f8c29" Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.644202 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.193:5001/openstack-k8s-operators/watcher-operator:8b30c5e7d456cae7da7365ebd2c0546fea54112e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" podUID="3b6f6ae6-4641-4dd2-9021-197e9ea97b2b" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.651811 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" event={"ID":"38646136-0a67-43c4-90ee-d88ae407d654","Type":"ContainerStarted","Data":"55e2d7523188becfd2ca67340aa72a21223ba0c094ae45e3e2c15b0ab154a9bb"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.661020 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/neutron-operator@sha256:14786c3a66c41213a03d6375c03209f22d439dd6e752317ddcbe21dda66bb569\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" podUID="38646136-0a67-43c4-90ee-d88ae407d654" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.661619 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" event={"ID":"6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7","Type":"ContainerStarted","Data":"3745f513c67817d1b5a57fe75ad63aa9f6a8bd58a8fd7da06ec07cc9a75f34ce"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.667033 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" event={"ID":"93dd9036-0e5e-4817-9a6c-eb89469de01b","Type":"ContainerStarted","Data":"24171311b58be0016fc2594a1351b52ee18629eed9dbc59b904fcc850d086cba"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.674280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" event={"ID":"39d3648e-5826-4e8a-b252-cb75e28651db","Type":"ContainerStarted","Data":"8c479299dfe2650fede76e5bf22727ad711fae2ccb02e46a5ac0355d6e11f7f8"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.701517 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" event={"ID":"b5c3560a-18be-4f65-a9f7-0dddccb36193","Type":"ContainerStarted","Data":"8f9d7129d966fe913acf741dc8340c33a26e9969e8e984f3fc0183763f557d55"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.717351 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f\\\"\"" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" podUID="b5c3560a-18be-4f65-a9f7-0dddccb36193" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.730216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" event={"ID":"60dda427-fb0c-41c7-8ca8-9847554068f1","Type":"ContainerStarted","Data":"ff188a2ede358cf08a17969ce889775a2b4fbdb719ea8a71e7edf22de46757bd"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.731940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" event={"ID":"22f3f762-cc29-4a18-8bfc-430b85e041cc","Type":"ContainerStarted","Data":"569214b498fd718fd800c7b1baac3bd90cf5501f6375c82c9383ef7ad91e2c49"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.733411 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" podUID="22f3f762-cc29-4a18-8bfc-430b85e041cc" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.734684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" event={"ID":"6a91b170-b0ed-4156-a9ee-74efca2560e7","Type":"ContainerStarted","Data":"4d73d4ee4e4395a6f9a1061922a77dbad49c712bc106b58d10476d854080be71"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.735533 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" event={"ID":"6f563213-8471-44f5-83aa-820e73ed7746","Type":"ContainerStarted","Data":"b5a7a41c1cba0b4298379152224d373179050d7886e4e040f17785ea8818d1e2"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.737739 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" podUID="6f563213-8471-44f5-83aa-820e73ed7746" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.737958 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" event={"ID":"a4407ded-de50-4ae5-bf84-2d6a3baa565c","Type":"ContainerStarted","Data":"9b230d8a4591a2f7de90c1017f9b979412736a1b672b75572b25ae2ef35648e6"} Jan 28 07:04:58 crc kubenswrapper[4776]: E0128 07:04:58.738743 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/lmiccini/octavia-operator@sha256:e75170b995315ff39a52c1ee42fa65486f828a81bad9248710d3361c9847800d\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" podUID="a4407ded-de50-4ae5-bf84-2d6a3baa565c" Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.739259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" event={"ID":"846af064-1eb1-4384-9b88-95770199bcdc","Type":"ContainerStarted","Data":"a02fa9adef6b8fbe4c73c5c68860740c73831f1be40640597a3ea69c49c0d7f1"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.740537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" event={"ID":"bf30e81e-a5a3-4af7-9a47-673f431d3666","Type":"ContainerStarted","Data":"fda6432be81fd10a986a7afc066a81e39e6172c8b6a4e36708f001616a3ed90a"} Jan 28 07:04:58 crc kubenswrapper[4776]: I0128 07:04:58.767722 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" event={"ID":"5bc7efb1-0792-40f2-993a-eb865919048c","Type":"ContainerStarted","Data":"739c40f37b08bfa53c8addbd26abe09e8a58b16f5f925c17294ad83d85dbd2be"} Jan 28 07:04:59 crc kubenswrapper[4776]: I0128 07:04:59.664891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.665061 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.665105 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert podName:f9f1432a-2977-49f8-924a-5c82c86f1de0 nodeName:}" failed. No retries permitted until 2026-01-28 07:05:03.665092076 +0000 UTC m=+875.080752236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert") pod "infra-operator-controller-manager-7d75bc88d5-2fbq4" (UID: "f9f1432a-2977-49f8-924a-5c82c86f1de0") : secret "infra-operator-webhook-server-cert" not found Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.806982 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39\\\"\"" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" podUID="11a6de65-3758-4462-b2b0-9499232f8c29" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807008 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/octavia-operator@sha256:e75170b995315ff39a52c1ee42fa65486f828a81bad9248710d3361c9847800d\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" podUID="a4407ded-de50-4ae5-bf84-2d6a3baa565c" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807064 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" podUID="9390b35e-9791-4ef4-ab66-12c4662f4cdf" Jan 28 07:04:59 crc 
kubenswrapper[4776]: E0128 07:04:59.807071 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f\\\"\"" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" podUID="b5c3560a-18be-4f65-a9f7-0dddccb36193" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807037 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" podUID="6f563213-8471-44f5-83aa-820e73ed7746" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807097 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.193:5001/openstack-k8s-operators/watcher-operator:8b30c5e7d456cae7da7365ebd2c0546fea54112e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" podUID="3b6f6ae6-4641-4dd2-9021-197e9ea97b2b" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807104 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/neutron-operator@sha256:14786c3a66c41213a03d6375c03209f22d439dd6e752317ddcbe21dda66bb569\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" podUID="38646136-0a67-43c4-90ee-d88ae407d654" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.807164 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" podUID="22f3f762-cc29-4a18-8bfc-430b85e041cc" Jan 28 07:04:59 crc kubenswrapper[4776]: I0128 07:04:59.969254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.969497 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 07:04:59 crc kubenswrapper[4776]: E0128 07:04:59.969575 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. No retries permitted until 2026-01-28 07:05:03.969527272 +0000 UTC m=+875.385187432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 28 07:05:00 crc kubenswrapper[4776]: I0128 07:05:00.274623 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:00 crc kubenswrapper[4776]: I0128 07:05:00.274935 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:00 crc kubenswrapper[4776]: E0128 07:05:00.274808 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 28 07:05:00 crc kubenswrapper[4776]: E0128 07:05:00.275052 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:04.275032917 +0000 UTC m=+875.690693077 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found
Jan 28 07:05:00 crc kubenswrapper[4776]: E0128 07:05:00.275064 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 28 07:05:00 crc kubenswrapper[4776]: E0128 07:05:00.275112 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:04.275096799 +0000 UTC m=+875.690756949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found
Jan 28 07:05:03 crc kubenswrapper[4776]: I0128 07:05:03.727165 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"
Jan 28 07:05:03 crc kubenswrapper[4776]: E0128 07:05:03.727473 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 28 07:05:03 crc kubenswrapper[4776]: E0128 07:05:03.727840 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert podName:f9f1432a-2977-49f8-924a-5c82c86f1de0 nodeName:}" failed. No retries permitted until 2026-01-28 07:05:11.727799938 +0000 UTC m=+883.143460138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert") pod "infra-operator-controller-manager-7d75bc88d5-2fbq4" (UID: "f9f1432a-2977-49f8-924a-5c82c86f1de0") : secret "infra-operator-webhook-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: I0128 07:05:04.032722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56"
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.032865 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.032918 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. No retries permitted until 2026-01-28 07:05:12.032903833 +0000 UTC m=+883.448563983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: I0128 07:05:04.337416 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:04 crc kubenswrapper[4776]: I0128 07:05:04.337473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.337654 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.337692 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.337711 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:12.337694369 +0000 UTC m=+883.753354539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found
Jan 28 07:05:04 crc kubenswrapper[4776]: E0128 07:05:04.337773 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:12.3377572 +0000 UTC m=+883.753417360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.311607 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.895011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" event={"ID":"bd49109f-40b2-4db9-92d7-75aaf1093a21","Type":"ContainerStarted","Data":"c1043871ab0e5757935d1866e7d30f9a43b1ffddbe4a4bff4eaeeac2435cde98"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.895297 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.900352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" event={"ID":"2f79777a-6f48-42d4-b39e-4393e932aea0","Type":"ContainerStarted","Data":"4fdd4017e964bae03957742c5f0a43d6153933e61e6c888313fb4f08bfe9361f"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.901165 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.911714 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" event={"ID":"39d3648e-5826-4e8a-b252-cb75e28651db","Type":"ContainerStarted","Data":"66cc94fa2cf4a6609918d0cc0dd37ae08407909271eb835869755ef5acbf4c1e"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.912736 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.915096 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" event={"ID":"bf30e81e-a5a3-4af7-9a47-673f431d3666","Type":"ContainerStarted","Data":"fc465cae6313988bc3a80c936162cd1c0dcafea945b5bf6abcf76467805b1fc2"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.915910 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.924764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" event={"ID":"93dd9036-0e5e-4817-9a6c-eb89469de01b","Type":"ContainerStarted","Data":"316351587bddade2724b6b1896286de8e90153acb56d98d3e102e3ac97043a14"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.925357 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt" podStartSLOduration=3.146433403 podStartE2EDuration="15.925310555s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:57.526868448 +0000 UTC m=+868.942528608" lastFinishedPulling="2026-01-28 07:05:10.3057456 +0000 UTC m=+881.721405760" observedRunningTime="2026-01-28 07:05:10.923563468 +0000 UTC m=+882.339223628" watchObservedRunningTime="2026-01-28 07:05:10.925310555 +0000 UTC m=+882.340970715"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.925517 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.939212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" event={"ID":"5bc7efb1-0792-40f2-993a-eb865919048c","Type":"ContainerStarted","Data":"8d3801310f076f1e8e2d6f10d3b245da5530e0efe039faa5b7f65bd08005c18e"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.939782 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.949961 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4" podStartSLOduration=3.620728256 podStartE2EDuration="15.949940232s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.056688307 +0000 UTC m=+869.472348467" lastFinishedPulling="2026-01-28 07:05:10.385900273 +0000 UTC m=+881.801560443" observedRunningTime="2026-01-28 07:05:10.944271337 +0000 UTC m=+882.359931497" watchObservedRunningTime="2026-01-28 07:05:10.949940232 +0000 UTC m=+882.365600392"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.950900 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" event={"ID":"1a0ddddf-b0e4-4bdb-bf00-c978366213a0","Type":"ContainerStarted","Data":"4db41324585f9af4a3c03455c3d1ab165f53ec4c0167ebfd1fb58940d16a4a90"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.951647 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.953101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" event={"ID":"6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7","Type":"ContainerStarted","Data":"0f4ac1a15eba732a4ca9f81fc4367fdbef772d2ff8141c276f96b9b0d168c864"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.953570 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.958828 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" event={"ID":"846af064-1eb1-4384-9b88-95770199bcdc","Type":"ContainerStarted","Data":"6d6d3401fd6cfe683114b962d2a47b189e33fba5dc78a3168438283945e73901"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.959607 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.967477 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" event={"ID":"79937ab5-c85f-4a4a-b35f-3b5d3711cbf0","Type":"ContainerStarted","Data":"541f1794dd3f7d1bc036520f79bff40b6c2a3af64c5986eadd75de7f916b2e0b"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.968185 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.980735 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" event={"ID":"60dda427-fb0c-41c7-8ca8-9847554068f1","Type":"ContainerStarted","Data":"fdb6d2b4eb8c73e297142b2b46e9380d56549171cffb2f735394b583f955f3f7"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.981350 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.988592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" event={"ID":"6a91b170-b0ed-4156-a9ee-74efca2560e7","Type":"ContainerStarted","Data":"c0bda81068a61385dbdec8a6e174c0f332b8685cedd8af4bb45f0bcb50dad663"}
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.989318 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.997897 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97" podStartSLOduration=2.706794262 podStartE2EDuration="14.99787781s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.095780341 +0000 UTC m=+869.511440501" lastFinishedPulling="2026-01-28 07:05:10.386863869 +0000 UTC m=+881.802524049" observedRunningTime="2026-01-28 07:05:10.96695432 +0000 UTC m=+882.382614480" watchObservedRunningTime="2026-01-28 07:05:10.99787781 +0000 UTC m=+882.413537980"
Jan 28 07:05:10 crc kubenswrapper[4776]: I0128 07:05:10.998576 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c" podStartSLOduration=3.749032162 podStartE2EDuration="15.998570729s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.056948714 +0000 UTC m=+869.472608874" lastFinishedPulling="2026-01-28 07:05:10.306487281 +0000 UTC m=+881.722147441" observedRunningTime="2026-01-28 07:05:10.989669304 +0000 UTC m=+882.405329464" watchObservedRunningTime="2026-01-28 07:05:10.998570729 +0000 UTC m=+882.414230889"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.014962 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst" podStartSLOduration=3.236687583 podStartE2EDuration="16.014941298s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:57.527570447 +0000 UTC m=+868.943230607" lastFinishedPulling="2026-01-28 07:05:10.305824162 +0000 UTC m=+881.721484322" observedRunningTime="2026-01-28 07:05:11.012143282 +0000 UTC m=+882.427803442" watchObservedRunningTime="2026-01-28 07:05:11.014941298 +0000 UTC m=+882.430601458"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.051744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4" podStartSLOduration=3.774149893 podStartE2EDuration="16.05172337s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.02077481 +0000 UTC m=+869.436434970" lastFinishedPulling="2026-01-28 07:05:10.298348277 +0000 UTC m=+881.714008447" observedRunningTime="2026-01-28 07:05:11.045746885 +0000 UTC m=+882.461407055" watchObservedRunningTime="2026-01-28 07:05:11.05172337 +0000 UTC m=+882.467383520"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.539671 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd" podStartSLOduration=4.295773557 podStartE2EDuration="16.539652958s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.062294051 +0000 UTC m=+869.477954211" lastFinishedPulling="2026-01-28 07:05:10.306173452 +0000 UTC m=+881.721833612" observedRunningTime="2026-01-28 07:05:11.133763844 +0000 UTC m=+882.549424004" watchObservedRunningTime="2026-01-28 07:05:11.539652958 +0000 UTC m=+882.955313118"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.542009 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c" podStartSLOduration=4.27946502 podStartE2EDuration="16.542000813s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.048684547 +0000 UTC m=+869.464344707" lastFinishedPulling="2026-01-28 07:05:10.31122032 +0000 UTC m=+881.726880500" observedRunningTime="2026-01-28 07:05:11.539390511 +0000 UTC m=+882.955050671" watchObservedRunningTime="2026-01-28 07:05:11.542000813 +0000 UTC m=+882.957660973"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.587956 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5" podStartSLOduration=4.162888495 podStartE2EDuration="16.587938415s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.025917671 +0000 UTC m=+869.441577831" lastFinishedPulling="2026-01-28 07:05:10.450967591 +0000 UTC m=+881.866627751" observedRunningTime="2026-01-28 07:05:11.584911282 +0000 UTC m=+883.000571442" watchObservedRunningTime="2026-01-28 07:05:11.587938415 +0000 UTC m=+883.003598575"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.656560 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2" podStartSLOduration=3.419338832 podStartE2EDuration="15.656515919s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.062774814 +0000 UTC m=+869.478434974" lastFinishedPulling="2026-01-28 07:05:10.299951901 +0000 UTC m=+881.715612061" observedRunningTime="2026-01-28 07:05:11.650700399 +0000 UTC m=+883.066360559" watchObservedRunningTime="2026-01-28 07:05:11.656515919 +0000 UTC m=+883.072176079"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.690122 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt" podStartSLOduration=4.329834263 podStartE2EDuration="16.690108862s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.025626674 +0000 UTC m=+869.441286824" lastFinishedPulling="2026-01-28 07:05:10.385901243 +0000 UTC m=+881.801561423" observedRunningTime="2026-01-28 07:05:11.688118468 +0000 UTC m=+883.103778628" watchObservedRunningTime="2026-01-28 07:05:11.690108862 +0000 UTC m=+883.105769022"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.718798 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26" podStartSLOduration=4.478094017 podStartE2EDuration="16.718781601s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.057354295 +0000 UTC m=+869.473014455" lastFinishedPulling="2026-01-28 07:05:10.298041889 +0000 UTC m=+881.713702039" observedRunningTime="2026-01-28 07:05:11.717900966 +0000 UTC m=+883.133561186" watchObservedRunningTime="2026-01-28 07:05:11.718781601 +0000 UTC m=+883.134441761"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.748966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.754081 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9f1432a-2977-49f8-924a-5c82c86f1de0-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-2fbq4\" (UID: \"f9f1432a-2977-49f8-924a-5c82c86f1de0\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"
Jan 28 07:05:11 crc kubenswrapper[4776]: I0128 07:05:11.891943 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"
Jan 28 07:05:12 crc kubenswrapper[4776]: I0128 07:05:12.052568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56"
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.053989 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.054064 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert podName:9eae60fd-6135-4e41-bb77-e3caae71237d nodeName:}" failed. No retries permitted until 2026-01-28 07:05:28.054041033 +0000 UTC m=+899.469701183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" (UID: "9eae60fd-6135-4e41-bb77-e3caae71237d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 28 07:05:12 crc kubenswrapper[4776]: I0128 07:05:12.201258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"]
Jan 28 07:05:12 crc kubenswrapper[4776]: I0128 07:05:12.363122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:12 crc kubenswrapper[4776]: I0128 07:05:12.363246 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.363358 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.363403 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:28.363389984 +0000 UTC m=+899.779050144 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "metrics-server-cert" not found
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.363737 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 28 07:05:12 crc kubenswrapper[4776]: E0128 07:05:12.363764 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs podName:70aa7185-ded8-4807-822c-69fc5b03feeb nodeName:}" failed. No retries permitted until 2026-01-28 07:05:28.363756914 +0000 UTC m=+899.779417074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs") pod "openstack-operator-controller-manager-6fdbb46688-tnzx5" (UID: "70aa7185-ded8-4807-822c-69fc5b03feeb") : secret "webhook-server-cert" not found
Jan 28 07:05:13 crc kubenswrapper[4776]: I0128 07:05:13.033210 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" event={"ID":"f9f1432a-2977-49f8-924a-5c82c86f1de0","Type":"ContainerStarted","Data":"9feca64ac424384e1d6bc897f6960245419c0d634022510d637814ce339b6759"}
Jan 28 07:05:15 crc kubenswrapper[4776]: I0128 07:05:15.993893 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-wjdst"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.002034 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-vzbmt"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.078084 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-pxsb4"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.150351 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vwpnd"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.193328 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-4mt8c"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.396519 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-d2j6c"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.427029 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-pxjl4"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.430571 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-lltvt"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.483915 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-wwhp5"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.559028 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-4xx26"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.584098 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-8w6p2"
Jan 28 07:05:16 crc kubenswrapper[4776]: I0128 07:05:16.666353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-8dl97"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.103392 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" event={"ID":"9390b35e-9791-4ef4-ab66-12c4662f4cdf","Type":"ContainerStarted","Data":"e6bf6a10797ff3d4b07cb63209c6a6acf28d0df815e56dbbb56690adff67dddb"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.104767 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.106723 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" event={"ID":"38646136-0a67-43c4-90ee-d88ae407d654","Type":"ContainerStarted","Data":"b5815c16af97fba6ef51ec17f9563893a597062cb8ecbb5f89404fc6ac327091"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.107088 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.108347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" event={"ID":"f9f1432a-2977-49f8-924a-5c82c86f1de0","Type":"ContainerStarted","Data":"35146fec9071e87da80fadc9259298d9c2473375f5d67f09b6bb90ce2a774805"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.108461 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.109756 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" event={"ID":"a4407ded-de50-4ae5-bf84-2d6a3baa565c","Type":"ContainerStarted","Data":"299cf7801d73350743e2f5afb2409ccfae5cc3e68f0dac6a7a764c08fd3c51dc"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.110387 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.112941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" event={"ID":"22f3f762-cc29-4a18-8bfc-430b85e041cc","Type":"ContainerStarted","Data":"b6991a141c8cc3e1abb0d0a6fe5f54044b09602bed76ba10f3cbbb95245bd984"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.115115 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" event={"ID":"11a6de65-3758-4462-b2b0-9499232f8c29","Type":"ContainerStarted","Data":"d704ea5d7008bb55eab657c4e4c27f0f483df8a42d71c01fbc5cc3fe7886cb07"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.115591 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.116887 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" event={"ID":"b5c3560a-18be-4f65-a9f7-0dddccb36193","Type":"ContainerStarted","Data":"a3bf592a49b626289a8c65839bc36423c95147d5d93bcbc4dd250d1e3ef9147b"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.117222 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.118532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" event={"ID":"6f563213-8471-44f5-83aa-820e73ed7746","Type":"ContainerStarted","Data":"d294217fa2228432d6ca65bb4f2b3ff908143f765677534640b30c735923b071"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.118922 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.121712 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" event={"ID":"3b6f6ae6-4641-4dd2-9021-197e9ea97b2b","Type":"ContainerStarted","Data":"afd1c3681b82400c7eeb8a893fdb39c11c4060c5ea7fb73108c81d13944c05cb"}
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.122069 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.175990 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8wdwl" podStartSLOduration=2.791032817 podStartE2EDuration="25.175969014s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.205924268 +0000 UTC m=+869.621584428" lastFinishedPulling="2026-01-28 07:05:20.590860465 +0000 UTC m=+892.006520625" observedRunningTime="2026-01-28 07:05:21.171601624 +0000 UTC m=+892.587261784" watchObservedRunningTime="2026-01-28 07:05:21.175969014 +0000 UTC m=+892.591629174"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.176439 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" podStartSLOduration=2.863849578 podStartE2EDuration="25.176435047s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.189292951 +0000 UTC m=+869.604953111" lastFinishedPulling="2026-01-28 07:05:20.5018784 +0000 UTC m=+891.917538580" observedRunningTime="2026-01-28 07:05:21.142209026 +0000 UTC m=+892.557869186" watchObservedRunningTime="2026-01-28 07:05:21.176435047 +0000 UTC m=+892.592095207"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.209351 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" podStartSLOduration=3.805913675 podStartE2EDuration="26.20933291s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.096219743 +0000 UTC m=+869.511879903" lastFinishedPulling="2026-01-28 07:05:20.499638978 +0000 UTC m=+891.915299138" observedRunningTime="2026-01-28 07:05:21.200625011 +0000 UTC m=+892.616285171" watchObservedRunningTime="2026-01-28 07:05:21.20933291 +0000 UTC m=+892.624993070"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.238065 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" podStartSLOduration=2.853274698 podStartE2EDuration="25.23804948s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.115060951 +0000 UTC m=+869.530721111" lastFinishedPulling="2026-01-28 07:05:20.499835733 +0000 UTC m=+891.915495893" observedRunningTime="2026-01-28 07:05:21.2322156 +0000 UTC m=+892.647875760" watchObservedRunningTime="2026-01-28 07:05:21.23804948 +0000 UTC m=+892.653709640"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.337762 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" podStartSLOduration=18.085515448 podStartE2EDuration="26.337741039s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:05:12.253278808 +0000 UTC m=+883.668938968" lastFinishedPulling="2026-01-28 07:05:20.505504399 +0000 UTC m=+891.921164559" observedRunningTime="2026-01-28 07:05:21.261683069 +0000 UTC m=+892.677343239" watchObservedRunningTime="2026-01-28 07:05:21.337741039 +0000 UTC m=+892.753401199"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.337902 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" podStartSLOduration=4.009003037 podStartE2EDuration="26.337897093s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.173891958 +0000 UTC m=+869.589552118" lastFinishedPulling="2026-01-28 07:05:20.502786014 +0000 UTC m=+891.918446174" observedRunningTime="2026-01-28 07:05:21.332571167 +0000 UTC m=+892.748231327" watchObservedRunningTime="2026-01-28 07:05:21.337897093 +0000 UTC m=+892.753557253"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.373515 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" podStartSLOduration=3.00111223 podStartE2EDuration="25.373495932s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.127453341 +0000 UTC m=+869.543113501" lastFinishedPulling="2026-01-28 07:05:20.499837043 +0000 UTC m=+891.915497203" observedRunningTime="2026-01-28 07:05:21.370465358 +0000 UTC m=+892.786125548" watchObservedRunningTime="2026-01-28 07:05:21.373495932 +0000 UTC m=+892.789156102"
Jan 28 07:05:21 crc kubenswrapper[4776]: I0128 07:05:21.408016 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" podStartSLOduration=4.626205486 podStartE2EDuration="26.407996769s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.187996435 +0000 UTC
m=+869.603656595" lastFinishedPulling="2026-01-28 07:05:19.969787718 +0000 UTC m=+891.385447878" observedRunningTime="2026-01-28 07:05:21.4040186 +0000 UTC m=+892.819678780" watchObservedRunningTime="2026-01-28 07:05:21.407996769 +0000 UTC m=+892.823656929" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.029107 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-vpc6t" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.047741 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-xw2v6" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.051976 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" podStartSLOduration=9.197537847 podStartE2EDuration="31.051958245s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:04:58.11536877 +0000 UTC m=+869.531028930" lastFinishedPulling="2026-01-28 07:05:19.969789168 +0000 UTC m=+891.385449328" observedRunningTime="2026-01-28 07:05:21.444116292 +0000 UTC m=+892.859776452" watchObservedRunningTime="2026-01-28 07:05:26.051958245 +0000 UTC m=+897.467618405" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.462228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-vrlcf" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.494443 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-94dd99d7d-gxmgb" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.626588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6kqqd" Jan 
28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.641224 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-jfcj6" Jan 28 07:05:26 crc kubenswrapper[4776]: I0128 07:05:26.981759 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-66fbd46fdf-dpq5g" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.153420 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.164469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9eae60fd-6135-4e41-bb77-e3caae71237d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hbb56\" (UID: \"9eae60fd-6135-4e41-bb77-e3caae71237d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.342594 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.457285 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.457792 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.465724 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-metrics-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.484346 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70aa7185-ded8-4807-822c-69fc5b03feeb-webhook-certs\") pod \"openstack-operator-controller-manager-6fdbb46688-tnzx5\" (UID: \"70aa7185-ded8-4807-822c-69fc5b03feeb\") " pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.550038 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.849504 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56"] Jan 28 07:05:28 crc kubenswrapper[4776]: W0128 07:05:28.850720 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eae60fd_6135_4e41_bb77_e3caae71237d.slice/crio-bd01be5fb9e0ad5623ef8706229e8ea40c1e9998e8ef61f1b54700b4ae43b73c WatchSource:0}: Error finding container bd01be5fb9e0ad5623ef8706229e8ea40c1e9998e8ef61f1b54700b4ae43b73c: Status 404 returned error can't find the container with id bd01be5fb9e0ad5623ef8706229e8ea40c1e9998e8ef61f1b54700b4ae43b73c Jan 28 07:05:28 crc kubenswrapper[4776]: I0128 07:05:28.982160 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5"] Jan 28 07:05:28 crc kubenswrapper[4776]: W0128 07:05:28.987010 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70aa7185_ded8_4807_822c_69fc5b03feeb.slice/crio-e203279b7e8370f909ff05d42ecb7c170c057085d04b021a5a74ece884948693 WatchSource:0}: Error finding container e203279b7e8370f909ff05d42ecb7c170c057085d04b021a5a74ece884948693: Status 404 returned error can't find the container with id e203279b7e8370f909ff05d42ecb7c170c057085d04b021a5a74ece884948693 Jan 28 07:05:29 crc kubenswrapper[4776]: I0128 07:05:29.179282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" event={"ID":"9eae60fd-6135-4e41-bb77-e3caae71237d","Type":"ContainerStarted","Data":"bd01be5fb9e0ad5623ef8706229e8ea40c1e9998e8ef61f1b54700b4ae43b73c"} Jan 28 07:05:29 crc kubenswrapper[4776]: I0128 
07:05:29.180234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" event={"ID":"70aa7185-ded8-4807-822c-69fc5b03feeb","Type":"ContainerStarted","Data":"e203279b7e8370f909ff05d42ecb7c170c057085d04b021a5a74ece884948693"} Jan 28 07:05:30 crc kubenswrapper[4776]: I0128 07:05:30.199654 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" event={"ID":"70aa7185-ded8-4807-822c-69fc5b03feeb","Type":"ContainerStarted","Data":"1c87b3a38ca6f5893641b581e2f7727c633e8c951c43f85e0eaf6f82250b5e75"} Jan 28 07:05:30 crc kubenswrapper[4776]: I0128 07:05:30.200030 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:30 crc kubenswrapper[4776]: I0128 07:05:30.236339 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" podStartSLOduration=34.236299991 podStartE2EDuration="34.236299991s" podCreationTimestamp="2026-01-28 07:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:05:30.232606339 +0000 UTC m=+901.648266509" watchObservedRunningTime="2026-01-28 07:05:30.236299991 +0000 UTC m=+901.651960161" Jan 28 07:05:31 crc kubenswrapper[4776]: I0128 07:05:31.898089 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-2fbq4" Jan 28 07:05:32 crc kubenswrapper[4776]: I0128 07:05:32.218053 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" 
event={"ID":"9eae60fd-6135-4e41-bb77-e3caae71237d","Type":"ContainerStarted","Data":"53f55113acf37b8c4284d73efe2dfc4a11612413510470c5bc518c6cb584151a"} Jan 28 07:05:32 crc kubenswrapper[4776]: I0128 07:05:32.218316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:05:32 crc kubenswrapper[4776]: I0128 07:05:32.240060 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" podStartSLOduration=34.020730667 podStartE2EDuration="37.240024743s" podCreationTimestamp="2026-01-28 07:04:55 +0000 UTC" firstStartedPulling="2026-01-28 07:05:28.852873544 +0000 UTC m=+900.268533704" lastFinishedPulling="2026-01-28 07:05:32.07216762 +0000 UTC m=+903.487827780" observedRunningTime="2026-01-28 07:05:32.239987262 +0000 UTC m=+903.655647432" watchObservedRunningTime="2026-01-28 07:05:32.240024743 +0000 UTC m=+903.655684943" Jan 28 07:05:38 crc kubenswrapper[4776]: I0128 07:05:38.353010 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hbb56" Jan 28 07:05:38 crc kubenswrapper[4776]: I0128 07:05:38.557128 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6fdbb46688-tnzx5" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.417000 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.419062 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.424144 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.424873 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7wbxq" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.425018 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.426775 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.427200 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.454988 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.456179 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.460441 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.492264 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.527782 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.527849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwqg\" (UniqueName: \"kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.527891 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.527921 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 
crc kubenswrapper[4776]: I0128 07:05:58.527946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8wb\" (UniqueName: \"kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.629145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwqg\" (UniqueName: \"kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.629532 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.629599 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.629625 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8wb\" (UniqueName: \"kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 
07:05:58.629729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.630366 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.630525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.630628 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.649292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwqg\" (UniqueName: \"kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg\") pod \"dnsmasq-dns-675f4bcbfc-kzv8v\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.650143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8wb\" (UniqueName: 
\"kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb\") pod \"dnsmasq-dns-78dd6ddcc-lmf94\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.744318 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:05:58 crc kubenswrapper[4776]: I0128 07:05:58.784870 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:05:59 crc kubenswrapper[4776]: I0128 07:05:59.232172 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 28 07:05:59 crc kubenswrapper[4776]: I0128 07:05:59.297605 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:05:59 crc kubenswrapper[4776]: I0128 07:05:59.502996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" event={"ID":"87491928-b21a-481b-9f85-fcc174c15fc2","Type":"ContainerStarted","Data":"f961f04ac941d1d5bf8b4c08219d13eae6107ee19e0cccc83fadc6b28278dca7"} Jan 28 07:05:59 crc kubenswrapper[4776]: I0128 07:05:59.504340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" event={"ID":"18891819-cdd4-4c3c-9408-635075bcca14","Type":"ContainerStarted","Data":"8f2b5ef7d667ba1ba2bb5710834e04c7eb3b7d3d4eaff57bd43fa901f29b1163"} Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.040232 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.066761 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.068350 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.087333 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.169151 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2x6f\" (UniqueName: \"kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.169512 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.169596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.271418 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.271484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.271518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2x6f\" (UniqueName: \"kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.272661 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.273130 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.306612 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2x6f\" (UniqueName: \"kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f\") pod \"dnsmasq-dns-666b6646f7-zp7nv\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.335364 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.345012 4776 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.346913 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.367290 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.395160 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.474009 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.474064 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.474137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dlzx\" (UniqueName: \"kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.575186 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.575622 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dlzx\" (UniqueName: \"kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.575721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.576020 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.576471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.630680 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dlzx\" (UniqueName: \"kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx\") pod 
\"dnsmasq-dns-57d769cc4f-t7dww\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.667038 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:01 crc kubenswrapper[4776]: I0128 07:06:01.881479 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.158133 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.204475 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.206016 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211203 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211234 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211332 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211493 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211628 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s9p8g" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.211794 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 28 07:06:02 crc 
kubenswrapper[4776]: I0128 07:06:02.213305 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.219267 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.291981 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292085 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292122 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292147 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292172 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292400 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292447 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.292533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjg4m\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.394866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.394921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.394938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.394977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjg4m\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " 
pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395067 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395126 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395142 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.395174 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.396042 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.396441 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.396931 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.398472 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.400605 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.401111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.401480 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.402126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.402749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.404186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.414714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjg4m\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.418884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.461815 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.463193 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.470675 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.470847 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.470933 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.471007 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.471113 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.470848 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.471234 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6sr" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.471539 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.538887 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.553728 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" event={"ID":"a74f1902-b7a2-4bf8-a558-3382f6790e62","Type":"ContainerStarted","Data":"6767e6e8526d45d44c4f6938c40bb4ef5a8d59b7a4f03d4854e56143e1f604e6"} Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603180 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603222 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603249 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc 
kubenswrapper[4776]: I0128 07:06:02.603318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603346 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603409 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: 
I0128 07:06:02.603427 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.603443 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xlq\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704178 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704218 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704248 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704312 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704327 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xlq\" (UniqueName: 
\"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704381 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.704882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.705122 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.705149 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c544ad4a-db14-419a-b423-435e8416f597\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.705934 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.706643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.706995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.709894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.710293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc 
kubenswrapper[4776]: I0128 07:06:02.710893 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.721136 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.722228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xlq\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.741645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:02 crc kubenswrapper[4776]: I0128 07:06:02.790030 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.712917 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.715766 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.718931 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dbbh5" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.719143 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.720230 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.720446 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.723411 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.725264 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823591 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823673 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-default\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823728 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kolla-config\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.823791 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw67w\" (UniqueName: 
\"kubernetes.io/projected/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kube-api-access-hw67w\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.924824 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.924895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.924919 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kolla-config\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.924946 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.924965 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw67w\" (UniqueName: \"kubernetes.io/projected/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kube-api-access-hw67w\") pod \"openstack-galera-0\" (UID: 
\"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.925031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.925075 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.925307 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.925481 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.926010 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kolla-config\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.926083 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-default\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.931711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-config-data-default\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.931747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.932285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.943523 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.950593 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw67w\" (UniqueName: 
\"kubernetes.io/projected/e46126d4-c96e-4d66-9a2e-7f6873a6a1dd-kube-api-access-hw67w\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:03 crc kubenswrapper[4776]: I0128 07:06:03.953299 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd\") " pod="openstack/openstack-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.057226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.908700 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.913795 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.916326 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-czq4k" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.917521 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.917786 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.917943 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.948469 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.948632 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.948710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.948814 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.949048 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.949135 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.949469 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.953086 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:04 crc kubenswrapper[4776]: I0128 07:06:04.953178 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54g8\" (UniqueName: \"kubernetes.io/projected/b485d028-58ae-46ec-afd9-720d1a05bade-kube-api-access-t54g8\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: W0128 07:06:05.030978 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode365bfc7_99d2_4c63_b0ec_fdeafa6eb7bc.slice/crio-c53e7202e8a836549fb5e1d23d37fc42a4204a7854db3aa16f63b1f150a46711 WatchSource:0}: Error finding container c53e7202e8a836549fb5e1d23d37fc42a4204a7854db3aa16f63b1f150a46711: Status 404 returned error can't find the container with id c53e7202e8a836549fb5e1d23d37fc42a4204a7854db3aa16f63b1f150a46711 Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054245 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054324 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054348 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t54g8\" (UniqueName: \"kubernetes.io/projected/b485d028-58ae-46ec-afd9-720d1a05bade-kube-api-access-t54g8\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054457 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054486 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054517 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.054573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.055146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.055408 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.055512 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Jan 28 
07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.055580 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.057732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485d028-58ae-46ec-afd9-720d1a05bade-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.061991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.070464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b485d028-58ae-46ec-afd9-720d1a05bade-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.081357 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54g8\" (UniqueName: \"kubernetes.io/projected/b485d028-58ae-46ec-afd9-720d1a05bade-kube-api-access-t54g8\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.085103 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b485d028-58ae-46ec-afd9-720d1a05bade\") " pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.249180 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.250318 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.255511 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.255728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.255867 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lpcv9" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.271507 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.289279 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.358399 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtgg\" (UniqueName: \"kubernetes.io/projected/f1377523-89dd-4311-886a-af2f7bb607b8-kube-api-access-rrtgg\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.358499 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.358558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-config-data\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.358585 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-kolla-config\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.358615 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 
07:06:05.460016 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtgg\" (UniqueName: \"kubernetes.io/projected/f1377523-89dd-4311-886a-af2f7bb607b8-kube-api-access-rrtgg\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.460116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.460164 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-config-data\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.460187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-kolla-config\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.460219 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.461743 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-config-data\") pod 
\"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.464441 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1377523-89dd-4311-886a-af2f7bb607b8-kolla-config\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.466302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.472167 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377523-89dd-4311-886a-af2f7bb607b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.478716 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtgg\" (UniqueName: \"kubernetes.io/projected/f1377523-89dd-4311-886a-af2f7bb607b8-kube-api-access-rrtgg\") pod \"memcached-0\" (UID: \"f1377523-89dd-4311-886a-af2f7bb607b8\") " pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.573342 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 28 07:06:05 crc kubenswrapper[4776]: I0128 07:06:05.579504 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" event={"ID":"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc","Type":"ContainerStarted","Data":"c53e7202e8a836549fb5e1d23d37fc42a4204a7854db3aa16f63b1f150a46711"} Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.099203 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.100164 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.102154 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jd626" Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.116170 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.197244 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zk8s\" (UniqueName: \"kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s\") pod \"kube-state-metrics-0\" (UID: \"9c3e7326-8de9-4923-baae-72484416a58e\") " pod="openstack/kube-state-metrics-0" Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.298192 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zk8s\" (UniqueName: \"kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s\") pod \"kube-state-metrics-0\" (UID: \"9c3e7326-8de9-4923-baae-72484416a58e\") " pod="openstack/kube-state-metrics-0" Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.365573 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zk8s\" 
(UniqueName: \"kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s\") pod \"kube-state-metrics-0\" (UID: \"9c3e7326-8de9-4923-baae-72484416a58e\") " pod="openstack/kube-state-metrics-0" Jan 28 07:06:07 crc kubenswrapper[4776]: I0128 07:06:07.425504 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.428677 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.430987 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.433659 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.435272 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.435359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lmp7n" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.435422 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.436434 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.436636 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.440629 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.443572 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.450810 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530484 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530614 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530749 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets\") pod \"prometheus-metric-storage-0\" 
(UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530855 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.530943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4246\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.531042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.531115 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634328 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634390 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634425 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f4246\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634463 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634621 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.634640 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.635444 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.635758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.635877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " 
pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.638322 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.639242 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.639746 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2e9c055391cfae11cfbe6abdf3b945738020df5dfe58cd3a482199e94820340b/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.640161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.640898 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 
07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.641526 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.641962 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.653182 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4246\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.674321 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:08 crc kubenswrapper[4776]: I0128 07:06:08.759199 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:09 crc kubenswrapper[4776]: I0128 07:06:09.466723 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.449453 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.451298 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.452962 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.454341 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.454502 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.454829 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.455446 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hp79n" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.467420 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585262 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmj7c\" (UniqueName: \"kubernetes.io/projected/fd4193d3-abf1-457c-a774-de938b12b909-kube-api-access-dmj7c\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc 
kubenswrapper[4776]: I0128 07:06:11.585333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585359 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-config\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585472 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.585495 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd4193d3-abf1-457c-a774-de938b12b909-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686385 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd4193d3-abf1-457c-a774-de938b12b909-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmj7c\" (UniqueName: \"kubernetes.io/projected/fd4193d3-abf1-457c-a774-de938b12b909-kube-api-access-dmj7c\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686512 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686567 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-config\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.686625 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.687829 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.687947 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fd4193d3-abf1-457c-a774-de938b12b909-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.699830 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.716913 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.717198 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4193d3-abf1-457c-a774-de938b12b909-config\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.719168 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.719626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd4193d3-abf1-457c-a774-de938b12b909-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.752750 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.758374 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmj7c\" (UniqueName: \"kubernetes.io/projected/fd4193d3-abf1-457c-a774-de938b12b909-kube-api-access-dmj7c\") pod \"ovsdbserver-nb-0\" (UID: \"fd4193d3-abf1-457c-a774-de938b12b909\") " pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.957468 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lrzjl"] Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.958721 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.964255 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.964442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.964667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tjztr" Jan 28 07:06:11 crc kubenswrapper[4776]: I0128 07:06:11.972121 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl"] Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.019893 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mbldl"] Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.021445 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.033252 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mbldl"] Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.067278 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.097660 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-combined-ca-bundle\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.097710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d2cb31b-ab97-4714-9978-225821819328-scripts\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.097751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.097944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.098065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xn6h\" (UniqueName: \"kubernetes.io/projected/4d2cb31b-ab97-4714-9978-225821819328-kube-api-access-9xn6h\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " 
pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.098117 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-log-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.098156 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-ovn-controller-tls-certs\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.199731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.199779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-lib\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.199806 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-scripts\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc 
kubenswrapper[4776]: I0128 07:06:12.199820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-run\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.199858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200094 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjm9\" (UniqueName: \"kubernetes.io/projected/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-kube-api-access-stjm9\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xn6h\" (UniqueName: \"kubernetes.io/projected/4d2cb31b-ab97-4714-9978-225821819328-kube-api-access-9xn6h\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200233 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-log-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200259 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-ovn-controller-tls-certs\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-etc-ovs\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-log\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-run\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200465 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4d2cb31b-ab97-4714-9978-225821819328-var-log-ovn\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-combined-ca-bundle\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.200594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d2cb31b-ab97-4714-9978-225821819328-scripts\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.202706 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d2cb31b-ab97-4714-9978-225821819328-scripts\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.205922 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-ovn-controller-tls-certs\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.206521 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2cb31b-ab97-4714-9978-225821819328-combined-ca-bundle\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " 
pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.217720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xn6h\" (UniqueName: \"kubernetes.io/projected/4d2cb31b-ab97-4714-9978-225821819328-kube-api-access-9xn6h\") pod \"ovn-controller-lrzjl\" (UID: \"4d2cb31b-ab97-4714-9978-225821819328\") " pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.278263 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-lib\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302534 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-scripts\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302567 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-run\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stjm9\" (UniqueName: \"kubernetes.io/projected/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-kube-api-access-stjm9\") pod \"ovn-controller-ovs-mbldl\" 
(UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-etc-ovs\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302709 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-log\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302833 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-run\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302901 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-log\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.302989 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-var-lib\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.303266 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-etc-ovs\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.304748 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-scripts\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.318099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjm9\" (UniqueName: \"kubernetes.io/projected/7e98ecc8-0f85-413e-9b5a-4fe838eb9925-kube-api-access-stjm9\") pod \"ovn-controller-ovs-mbldl\" (UID: \"7e98ecc8-0f85-413e-9b5a-4fe838eb9925\") " pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:12 crc kubenswrapper[4776]: I0128 07:06:12.351076 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.899247 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.907686 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.909525 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.910622 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.911073 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.911219 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 28 07:06:13 crc kubenswrapper[4776]: I0128 07:06:13.911075 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-97mjs" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.029798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e356c55c-adea-433d-9f03-a403f330b085-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.029851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjfb\" (UniqueName: \"kubernetes.io/projected/e356c55c-adea-433d-9f03-a403f330b085-kube-api-access-8mjfb\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.029883 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.029910 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.030052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-config\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.030107 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.030244 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.030267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 
07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132197 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132250 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132337 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e356c55c-adea-433d-9f03-a403f330b085-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132368 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjfb\" (UniqueName: \"kubernetes.io/projected/e356c55c-adea-433d-9f03-a403f330b085-kube-api-access-8mjfb\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132570 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.132935 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e356c55c-adea-433d-9f03-a403f330b085-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.140021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.140145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-config\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.140170 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.141951 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " 
pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.142467 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e356c55c-adea-433d-9f03-a403f330b085-config\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.150761 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.151264 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.166516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjfb\" (UniqueName: \"kubernetes.io/projected/e356c55c-adea-433d-9f03-a403f330b085-kube-api-access-8mjfb\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.167394 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e356c55c-adea-433d-9f03-a403f330b085-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.168727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e356c55c-adea-433d-9f03-a403f330b085\") " pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:14 crc kubenswrapper[4776]: I0128 07:06:14.232688 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:16 crc kubenswrapper[4776]: I0128 07:06:16.671407 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerStarted","Data":"fb75978b93e00c3c602fc449419650bd95c41c4a2fb0172ca81a4c18a0425fda"} Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.794057 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.794251 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mk8wb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lmf94_openstack(87491928-b21a-481b-9f85-fcc174c15fc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.795408 4776 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" podUID="87491928-b21a-481b-9f85-fcc174c15fc2" Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.885740 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.886179 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfwqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-kzv8v_openstack(18891819-cdd4-4c3c-9408-635075bcca14): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:06:16 crc kubenswrapper[4776]: E0128 07:06:16.888196 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" podUID="18891819-cdd4-4c3c-9408-635075bcca14" Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.352719 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.381196 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.533463 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 28 07:06:17 crc kubenswrapper[4776]: W0128 07:06:17.546180 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1377523_89dd_4311_886a_af2f7bb607b8.slice/crio-600ff0c79ce92dcb4d2b93b9843fade57fdefbd3f2b0ad5da1e7dbf8caf411ce WatchSource:0}: Error finding container 600ff0c79ce92dcb4d2b93b9843fade57fdefbd3f2b0ad5da1e7dbf8caf411ce: Status 404 returned error can't find the container with id 600ff0c79ce92dcb4d2b93b9843fade57fdefbd3f2b0ad5da1e7dbf8caf411ce Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.722447 4776 generic.go:334] "Generic (PLEG): container finished" podID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerID="33cf747c3abbf6ecb8c8d515760524c0a94a659c4e6bf147005f33cc34c64bfa" exitCode=0 Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.722527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" event={"ID":"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc","Type":"ContainerDied","Data":"33cf747c3abbf6ecb8c8d515760524c0a94a659c4e6bf147005f33cc34c64bfa"} Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.733794 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd","Type":"ContainerStarted","Data":"710a107f9c6d718b9cb5176d8d2841fbda80601ca61733a2db2869c5c4f1de9e"} Jan 28 07:06:17 crc kubenswrapper[4776]: 
I0128 07:06:17.738104 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f1377523-89dd-4311-886a-af2f7bb607b8","Type":"ContainerStarted","Data":"600ff0c79ce92dcb4d2b93b9843fade57fdefbd3f2b0ad5da1e7dbf8caf411ce"} Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.747164 4776 generic.go:334] "Generic (PLEG): container finished" podID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerID="2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb" exitCode=0 Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.747259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" event={"ID":"a74f1902-b7a2-4bf8-a558-3382f6790e62","Type":"ContainerDied","Data":"2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb"} Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.749376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerStarted","Data":"79e27f88edee48bf353a689aaa4183425ba6414e5204161a2f2fce5ba58d40e0"} Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.853880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl"] Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.876186 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.892453 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 07:06:17 crc kubenswrapper[4776]: I0128 07:06:17.955129 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.009310 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.098181 4776 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.181677 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.203887 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.322727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfwqg\" (UniqueName: \"kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg\") pod \"18891819-cdd4-4c3c-9408-635075bcca14\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.322800 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config\") pod \"87491928-b21a-481b-9f85-fcc174c15fc2\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.322853 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc\") pod \"87491928-b21a-481b-9f85-fcc174c15fc2\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.322982 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk8wb\" (UniqueName: \"kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb\") pod \"87491928-b21a-481b-9f85-fcc174c15fc2\" (UID: \"87491928-b21a-481b-9f85-fcc174c15fc2\") " Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.323120 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config\") pod \"18891819-cdd4-4c3c-9408-635075bcca14\" (UID: \"18891819-cdd4-4c3c-9408-635075bcca14\") " Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.323794 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config" (OuterVolumeSpecName: "config") pod "87491928-b21a-481b-9f85-fcc174c15fc2" (UID: "87491928-b21a-481b-9f85-fcc174c15fc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.323835 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config" (OuterVolumeSpecName: "config") pod "18891819-cdd4-4c3c-9408-635075bcca14" (UID: "18891819-cdd4-4c3c-9408-635075bcca14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.324036 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87491928-b21a-481b-9f85-fcc174c15fc2" (UID: "87491928-b21a-481b-9f85-fcc174c15fc2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.328737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg" (OuterVolumeSpecName: "kube-api-access-bfwqg") pod "18891819-cdd4-4c3c-9408-635075bcca14" (UID: "18891819-cdd4-4c3c-9408-635075bcca14"). InnerVolumeSpecName "kube-api-access-bfwqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.329958 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb" (OuterVolumeSpecName: "kube-api-access-mk8wb") pod "87491928-b21a-481b-9f85-fcc174c15fc2" (UID: "87491928-b21a-481b-9f85-fcc174c15fc2"). InnerVolumeSpecName "kube-api-access-mk8wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.424578 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.424858 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk8wb\" (UniqueName: \"kubernetes.io/projected/87491928-b21a-481b-9f85-fcc174c15fc2-kube-api-access-mk8wb\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.424870 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18891819-cdd4-4c3c-9408-635075bcca14-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.424880 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfwqg\" (UniqueName: \"kubernetes.io/projected/18891819-cdd4-4c3c-9408-635075bcca14-kube-api-access-bfwqg\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.424889 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87491928-b21a-481b-9f85-fcc174c15fc2-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.759214 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" event={"ID":"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc","Type":"ContainerStarted","Data":"0479e45791f23fc97d375c97d7fa3c7220435eae00d8033bdb9fd35e46fabe0e"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.759394 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.762456 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e356c55c-adea-433d-9f03-a403f330b085","Type":"ContainerStarted","Data":"2b286087c7b8c51f5adb875789b03e0ee27818ff58b5c6a2e28bcd5a7be3d0aa"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.764080 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fd4193d3-abf1-457c-a774-de938b12b909","Type":"ContainerStarted","Data":"eb1a1c4d9bdb328777102a62230489aa02a70340a51bca2e74f922130aa9e93a"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.766000 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl" event={"ID":"4d2cb31b-ab97-4714-9978-225821819328","Type":"ContainerStarted","Data":"56a08a719a14730a65318a85d39a0ca51360861e647f8b379c1ae053a0f8754f"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.769320 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerStarted","Data":"b03badf491eb8fef69f99c8b719a8486d3e96012a7607158885cf521b5b4a741"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.773010 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c3e7326-8de9-4923-baae-72484416a58e","Type":"ContainerStarted","Data":"6b541f073377216d9d36f31f2ab3a9284023f3f535a47398a1b1229d83653a7c"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.792643 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" podStartSLOduration=5.885053068 podStartE2EDuration="17.79262556s" podCreationTimestamp="2026-01-28 07:06:01 +0000 UTC" firstStartedPulling="2026-01-28 07:06:05.033694096 +0000 UTC m=+936.449354256" lastFinishedPulling="2026-01-28 07:06:16.941266588 +0000 UTC m=+948.356926748" observedRunningTime="2026-01-28 07:06:18.784293244 +0000 UTC m=+950.199953414" watchObservedRunningTime="2026-01-28 07:06:18.79262556 +0000 UTC m=+950.208285720" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.809291 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" event={"ID":"a74f1902-b7a2-4bf8-a558-3382f6790e62","Type":"ContainerStarted","Data":"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.810345 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.815089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" event={"ID":"18891819-cdd4-4c3c-9408-635075bcca14","Type":"ContainerDied","Data":"8f2b5ef7d667ba1ba2bb5710834e04c7eb3b7d3d4eaff57bd43fa901f29b1163"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.815145 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-kzv8v" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.817208 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b485d028-58ae-46ec-afd9-720d1a05bade","Type":"ContainerStarted","Data":"46b62045a160ab68da9220641fac45b416793cb5482026015d53288b54602aff"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.818118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" event={"ID":"87491928-b21a-481b-9f85-fcc174c15fc2","Type":"ContainerDied","Data":"f961f04ac941d1d5bf8b4c08219d13eae6107ee19e0cccc83fadc6b28278dca7"} Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.818185 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lmf94" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.830314 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" podStartSLOduration=2.802647172 podStartE2EDuration="17.830295742s" podCreationTimestamp="2026-01-28 07:06:01 +0000 UTC" firstStartedPulling="2026-01-28 07:06:01.897643924 +0000 UTC m=+933.313304084" lastFinishedPulling="2026-01-28 07:06:16.925292494 +0000 UTC m=+948.340952654" observedRunningTime="2026-01-28 07:06:18.827467045 +0000 UTC m=+950.243127205" watchObservedRunningTime="2026-01-28 07:06:18.830295742 +0000 UTC m=+950.245955902" Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.956110 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.966490 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lmf94"] Jan 28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.977838 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 
28 07:06:18 crc kubenswrapper[4776]: I0128 07:06:18.982652 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-kzv8v"] Jan 28 07:06:19 crc kubenswrapper[4776]: I0128 07:06:19.098935 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mbldl"] Jan 28 07:06:19 crc kubenswrapper[4776]: I0128 07:06:19.321964 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18891819-cdd4-4c3c-9408-635075bcca14" path="/var/lib/kubelet/pods/18891819-cdd4-4c3c-9408-635075bcca14/volumes" Jan 28 07:06:19 crc kubenswrapper[4776]: I0128 07:06:19.322406 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87491928-b21a-481b-9f85-fcc174c15fc2" path="/var/lib/kubelet/pods/87491928-b21a-481b-9f85-fcc174c15fc2/volumes" Jan 28 07:06:21 crc kubenswrapper[4776]: I0128 07:06:21.853942 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mbldl" event={"ID":"7e98ecc8-0f85-413e-9b5a-4fe838eb9925","Type":"ContainerStarted","Data":"c04cc6fc4a0894651b87bd878060c9d37d0b0611a97fdfbed50c93cd08647464"} Jan 28 07:06:26 crc kubenswrapper[4776]: I0128 07:06:26.396695 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:26 crc kubenswrapper[4776]: I0128 07:06:26.669431 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:26 crc kubenswrapper[4776]: I0128 07:06:26.724332 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:26 crc kubenswrapper[4776]: I0128 07:06:26.894527 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="dnsmasq-dns" 
containerID="cri-o://020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8" gracePeriod=10 Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.829123 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.845881 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc\") pod \"a74f1902-b7a2-4bf8-a558-3382f6790e62\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.846017 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config\") pod \"a74f1902-b7a2-4bf8-a558-3382f6790e62\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.846180 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2x6f\" (UniqueName: \"kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f\") pod \"a74f1902-b7a2-4bf8-a558-3382f6790e62\" (UID: \"a74f1902-b7a2-4bf8-a558-3382f6790e62\") " Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.896795 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f" (OuterVolumeSpecName: "kube-api-access-q2x6f") pod "a74f1902-b7a2-4bf8-a558-3382f6790e62" (UID: "a74f1902-b7a2-4bf8-a558-3382f6790e62"). InnerVolumeSpecName "kube-api-access-q2x6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.910018 4776 generic.go:334] "Generic (PLEG): container finished" podID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerID="020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8" exitCode=0 Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.910071 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" event={"ID":"a74f1902-b7a2-4bf8-a558-3382f6790e62","Type":"ContainerDied","Data":"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8"} Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.910102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" event={"ID":"a74f1902-b7a2-4bf8-a558-3382f6790e62","Type":"ContainerDied","Data":"6767e6e8526d45d44c4f6938c40bb4ef5a8d59b7a4f03d4854e56143e1f604e6"} Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.910098 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-zp7nv" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.910118 4776 scope.go:117] "RemoveContainer" containerID="020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.921191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a74f1902-b7a2-4bf8-a558-3382f6790e62" (UID: "a74f1902-b7a2-4bf8-a558-3382f6790e62"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.926916 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config" (OuterVolumeSpecName: "config") pod "a74f1902-b7a2-4bf8-a558-3382f6790e62" (UID: "a74f1902-b7a2-4bf8-a558-3382f6790e62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.948514 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2x6f\" (UniqueName: \"kubernetes.io/projected/a74f1902-b7a2-4bf8-a558-3382f6790e62-kube-api-access-q2x6f\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.948560 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:27 crc kubenswrapper[4776]: I0128 07:06:27.948571 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74f1902-b7a2-4bf8-a558-3382f6790e62-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.252787 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.257914 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-zp7nv"] Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.428142 4776 scope.go:117] "RemoveContainer" containerID="2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.478851 4776 scope.go:117] "RemoveContainer" containerID="020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8" Jan 28 07:06:28 crc kubenswrapper[4776]: E0128 
07:06:28.480319 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8\": container with ID starting with 020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8 not found: ID does not exist" containerID="020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.480362 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8"} err="failed to get container status \"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8\": rpc error: code = NotFound desc = could not find container \"020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8\": container with ID starting with 020cafdafca79fc1a8134d6767343fcaa3c48e5da3c900a1b4b7a016c62beaf8 not found: ID does not exist" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.480388 4776 scope.go:117] "RemoveContainer" containerID="2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb" Jan 28 07:06:28 crc kubenswrapper[4776]: E0128 07:06:28.480717 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb\": container with ID starting with 2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb not found: ID does not exist" containerID="2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.480743 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb"} err="failed to get container status \"2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb\": rpc 
error: code = NotFound desc = could not find container \"2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb\": container with ID starting with 2b21038001efd23da6a90332f653f629c4eb4d3685f6967dd966e7cb1229ccfb not found: ID does not exist" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.992905 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-59r62"] Jan 28 07:06:28 crc kubenswrapper[4776]: E0128 07:06:28.993516 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="dnsmasq-dns" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.993530 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="dnsmasq-dns" Jan 28 07:06:28 crc kubenswrapper[4776]: E0128 07:06:28.993581 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="init" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.993589 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="init" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.993748 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" containerName="dnsmasq-dns" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.994282 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:28 crc kubenswrapper[4776]: I0128 07:06:28.996719 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.000740 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-59r62"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066132 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovn-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066185 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmq74\" (UniqueName: \"kubernetes.io/projected/a390a227-8301-4ed3-80ee-06131089f499-kube-api-access-nmq74\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-combined-ca-bundle\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066258 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a390a227-8301-4ed3-80ee-06131089f499-config\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " 
pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066299 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.066314 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovs-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.113146 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.119886 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.121937 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.126943 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167153 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovs-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167287 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc 
kubenswrapper[4776]: I0128 07:06:29.167309 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovn-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spl25\" (UniqueName: \"kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmq74\" (UniqueName: \"kubernetes.io/projected/a390a227-8301-4ed3-80ee-06131089f499-kube-api-access-nmq74\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167399 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-combined-ca-bundle\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.167420 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a390a227-8301-4ed3-80ee-06131089f499-config\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 
07:06:29.167446 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.168456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovs-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.168524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a390a227-8301-4ed3-80ee-06131089f499-ovn-rundir\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.169245 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a390a227-8301-4ed3-80ee-06131089f499-config\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.194525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-combined-ca-bundle\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.194658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nmq74\" (UniqueName: \"kubernetes.io/projected/a390a227-8301-4ed3-80ee-06131089f499-kube-api-access-nmq74\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.195738 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a390a227-8301-4ed3-80ee-06131089f499-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-59r62\" (UID: \"a390a227-8301-4ed3-80ee-06131089f499\") " pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.253766 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.268749 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.268800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spl25\" (UniqueName: \"kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.268864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 
07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.268887 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.269737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.270234 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.271602 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.272449 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.274097 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.276000 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.281981 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.283797 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.307324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spl25\" (UniqueName: \"kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25\") pod \"dnsmasq-dns-7fd796d7df-9hx9z\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.321378 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74f1902-b7a2-4bf8-a558-3382f6790e62" path="/var/lib/kubelet/pods/a74f1902-b7a2-4bf8-a558-3382f6790e62/volumes" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.370865 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.370920 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.370983 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.371020 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.371132 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5gg\" (UniqueName: \"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.472652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.472716 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.472814 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5gg\" (UniqueName: \"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.472886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.472924 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.473736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.473776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.473897 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.474282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.489217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5gg\" (UniqueName: \"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg\") pod \"dnsmasq-dns-86db49b7ff-2hz7d\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.928448 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f1377523-89dd-4311-886a-af2f7bb607b8","Type":"ContainerStarted","Data":"77f74a6ecdc5ad7aae60f442aa564bca35e3e6c50018f01ac04ff3bfca9427e6"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.928607 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.932194 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl" 
event={"ID":"4d2cb31b-ab97-4714-9978-225821819328","Type":"ContainerStarted","Data":"d768c60cc1e8ed50ac65bdc90297c54bf891059c17abc2404fb160590794a6c4"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.932477 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lrzjl" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.933973 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerStarted","Data":"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.935229 4776 generic.go:334] "Generic (PLEG): container finished" podID="7e98ecc8-0f85-413e-9b5a-4fe838eb9925" containerID="e4a03b332c36b30ec7f966468cdb4091473e6cc3ef75ceae8a07477f6fa4572c" exitCode=0 Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.935277 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mbldl" event={"ID":"7e98ecc8-0f85-413e-9b5a-4fe838eb9925","Type":"ContainerDied","Data":"e4a03b332c36b30ec7f966468cdb4091473e6cc3ef75ceae8a07477f6fa4572c"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.947392 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b485d028-58ae-46ec-afd9-720d1a05bade","Type":"ContainerStarted","Data":"a7395e416a36ac06e72819ad5c383900da5a9799ab8da6bfbce4c5320beab9a0"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.951521 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd","Type":"ContainerStarted","Data":"33f4a4d03c6d8a13ed210359215cb04e251a765a95d7dc559081682cc20f68b9"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.956177 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"fd4193d3-abf1-457c-a774-de938b12b909","Type":"ContainerStarted","Data":"d2de915d321c3267f0b7b1ccf08147c9b9a55cbd40eefe85d4ce9722c0dec24a"} Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.958112 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.606339251 podStartE2EDuration="24.95809136s" podCreationTimestamp="2026-01-28 07:06:05 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.555402707 +0000 UTC m=+948.971062867" lastFinishedPulling="2026-01-28 07:06:26.907154816 +0000 UTC m=+958.322814976" observedRunningTime="2026-01-28 07:06:29.949191649 +0000 UTC m=+961.364851809" watchObservedRunningTime="2026-01-28 07:06:29.95809136 +0000 UTC m=+961.373751520" Jan 28 07:06:29 crc kubenswrapper[4776]: I0128 07:06:29.974650 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lrzjl" podStartSLOduration=9.470614679 podStartE2EDuration="18.974630939s" podCreationTimestamp="2026-01-28 07:06:11 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.950870025 +0000 UTC m=+949.366530185" lastFinishedPulling="2026-01-28 07:06:27.454886285 +0000 UTC m=+958.870546445" observedRunningTime="2026-01-28 07:06:29.971275818 +0000 UTC m=+961.386935978" watchObservedRunningTime="2026-01-28 07:06:29.974630939 +0000 UTC m=+961.390291099" Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.138426 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-59r62" Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.165220 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.185373 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.908919 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.974363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" event={"ID":"a81879d5-73b4-4375-929a-9c7ebf311145","Type":"ContainerStarted","Data":"4effb2fe2b680ddee9985f8807e468110415f93ba6b3c432fdef67a37fcb235c"} Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.976992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerStarted","Data":"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29"} Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.997632 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c3e7326-8de9-4923-baae-72484416a58e","Type":"ContainerStarted","Data":"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5"} Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.998285 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 28 07:06:30 crc kubenswrapper[4776]: I0128 07:06:30.999094 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-59r62"] Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.011655 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.016015 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mbldl" event={"ID":"7e98ecc8-0f85-413e-9b5a-4fe838eb9925","Type":"ContainerStarted","Data":"b1a700a546fca2d344d542a077bb52e971ade3e224555a2e9bba779b6a32cb8c"} Jan 28 07:06:31 crc 
kubenswrapper[4776]: I0128 07:06:31.016060 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mbldl" event={"ID":"7e98ecc8-0f85-413e-9b5a-4fe838eb9925","Type":"ContainerStarted","Data":"29a52c687cdcaebb71610d596236561902215137e6d1ba6a0c6a909e4d020590"} Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.016114 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.016127 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.024309 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.390520397 podStartE2EDuration="24.024291834s" podCreationTimestamp="2026-01-28 07:06:07 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.929456025 +0000 UTC m=+949.345116185" lastFinishedPulling="2026-01-28 07:06:28.563227462 +0000 UTC m=+959.978887622" observedRunningTime="2026-01-28 07:06:31.017270594 +0000 UTC m=+962.432930754" watchObservedRunningTime="2026-01-28 07:06:31.024291834 +0000 UTC m=+962.439951994" Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.028774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e356c55c-adea-433d-9f03-a403f330b085","Type":"ContainerStarted","Data":"a09d4431a355ace609a5374be0152dc8e1ab1121a347ddc1ab8761ca89ddb4c8"} Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.031608 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerStarted","Data":"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d"} Jan 28 07:06:31 crc kubenswrapper[4776]: I0128 07:06:31.039341 4776 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-ovs-mbldl" podStartSLOduration=13.501715132 podStartE2EDuration="20.039323491s" podCreationTimestamp="2026-01-28 07:06:11 +0000 UTC" firstStartedPulling="2026-01-28 07:06:21.269336127 +0000 UTC m=+952.684996287" lastFinishedPulling="2026-01-28 07:06:27.806944486 +0000 UTC m=+959.222604646" observedRunningTime="2026-01-28 07:06:31.037266786 +0000 UTC m=+962.452926946" watchObservedRunningTime="2026-01-28 07:06:31.039323491 +0000 UTC m=+962.454983641" Jan 28 07:06:31 crc kubenswrapper[4776]: W0128 07:06:31.039792 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod088e8d0a_b7b2_48dc_9920_6d49987066a5.slice/crio-8a0647e2086a5f5f4e2879ff7ad5299f7d263399616deed3067a5a95237f30a9 WatchSource:0}: Error finding container 8a0647e2086a5f5f4e2879ff7ad5299f7d263399616deed3067a5a95237f30a9: Status 404 returned error can't find the container with id 8a0647e2086a5f5f4e2879ff7ad5299f7d263399616deed3067a5a95237f30a9 Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.040219 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-59r62" event={"ID":"a390a227-8301-4ed3-80ee-06131089f499","Type":"ContainerStarted","Data":"b3c5818348a5f8d02abca09b273cd1298d9865a68c8f08c345b2cea4334b05e2"} Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.042392 4776 generic.go:334] "Generic (PLEG): container finished" podID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerID="8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0" exitCode=0 Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.042448 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" event={"ID":"088e8d0a-b7b2-48dc-9920-6d49987066a5","Type":"ContainerDied","Data":"8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0"} Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.042464 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" event={"ID":"088e8d0a-b7b2-48dc-9920-6d49987066a5","Type":"ContainerStarted","Data":"8a0647e2086a5f5f4e2879ff7ad5299f7d263399616deed3067a5a95237f30a9"} Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.044747 4776 generic.go:334] "Generic (PLEG): container finished" podID="a81879d5-73b4-4375-929a-9c7ebf311145" containerID="3021d1ef1ac9f05b7e03eb80352197390d85c8cbd36130d6a3f7c3b938f26c5d" exitCode=0 Jan 28 07:06:32 crc kubenswrapper[4776]: I0128 07:06:32.044904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" event={"ID":"a81879d5-73b4-4375-929a-9c7ebf311145","Type":"ContainerDied","Data":"3021d1ef1ac9f05b7e03eb80352197390d85c8cbd36130d6a3f7c3b938f26c5d"} Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.056286 4776 generic.go:334] "Generic (PLEG): container finished" podID="b485d028-58ae-46ec-afd9-720d1a05bade" containerID="a7395e416a36ac06e72819ad5c383900da5a9799ab8da6bfbce4c5320beab9a0" exitCode=0 Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.056378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b485d028-58ae-46ec-afd9-720d1a05bade","Type":"ContainerDied","Data":"a7395e416a36ac06e72819ad5c383900da5a9799ab8da6bfbce4c5320beab9a0"} Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.182290 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.353598 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc\") pod \"a81879d5-73b4-4375-929a-9c7ebf311145\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.353727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb\") pod \"a81879d5-73b4-4375-929a-9c7ebf311145\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.353817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spl25\" (UniqueName: \"kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25\") pod \"a81879d5-73b4-4375-929a-9c7ebf311145\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.353978 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config\") pod \"a81879d5-73b4-4375-929a-9c7ebf311145\" (UID: \"a81879d5-73b4-4375-929a-9c7ebf311145\") " Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.370406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25" (OuterVolumeSpecName: "kube-api-access-spl25") pod "a81879d5-73b4-4375-929a-9c7ebf311145" (UID: "a81879d5-73b4-4375-929a-9c7ebf311145"). InnerVolumeSpecName "kube-api-access-spl25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.374379 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a81879d5-73b4-4375-929a-9c7ebf311145" (UID: "a81879d5-73b4-4375-929a-9c7ebf311145"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.378423 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config" (OuterVolumeSpecName: "config") pod "a81879d5-73b4-4375-929a-9c7ebf311145" (UID: "a81879d5-73b4-4375-929a-9c7ebf311145"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.411042 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a81879d5-73b4-4375-929a-9c7ebf311145" (UID: "a81879d5-73b4-4375-929a-9c7ebf311145"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.456572 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.456869 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.456903 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spl25\" (UniqueName: \"kubernetes.io/projected/a81879d5-73b4-4375-929a-9c7ebf311145-kube-api-access-spl25\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:33 crc kubenswrapper[4776]: I0128 07:06:33.456919 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81879d5-73b4-4375-929a-9c7ebf311145-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.066872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b485d028-58ae-46ec-afd9-720d1a05bade","Type":"ContainerStarted","Data":"ea065a22a3b8df3fae70cba0206728530420c48b134306d2dbf58ecf5b7c8373"} Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.069408 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" event={"ID":"088e8d0a-b7b2-48dc-9920-6d49987066a5","Type":"ContainerStarted","Data":"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc"} Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.069574 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.071294 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" event={"ID":"a81879d5-73b4-4375-929a-9c7ebf311145","Type":"ContainerDied","Data":"4effb2fe2b680ddee9985f8807e468110415f93ba6b3c432fdef67a37fcb235c"} Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.071428 4776 scope.go:117] "RemoveContainer" containerID="3021d1ef1ac9f05b7e03eb80352197390d85c8cbd36130d6a3f7c3b938f26c5d" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.071328 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-9hx9z" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.082288 4776 generic.go:334] "Generic (PLEG): container finished" podID="e46126d4-c96e-4d66-9a2e-7f6873a6a1dd" containerID="33f4a4d03c6d8a13ed210359215cb04e251a765a95d7dc559081682cc20f68b9" exitCode=0 Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.082411 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd","Type":"ContainerDied","Data":"33f4a4d03c6d8a13ed210359215cb04e251a765a95d7dc559081682cc20f68b9"} Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.102270 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.48100349 podStartE2EDuration="31.102248471s" podCreationTimestamp="2026-01-28 07:06:03 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.947637127 +0000 UTC m=+949.363297287" lastFinishedPulling="2026-01-28 07:06:27.568882118 +0000 UTC m=+958.984542268" observedRunningTime="2026-01-28 07:06:34.100489873 +0000 UTC m=+965.516150033" watchObservedRunningTime="2026-01-28 07:06:34.102248471 +0000 UTC m=+965.517908631" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.124652 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" podStartSLOduration=5.124633239 
podStartE2EDuration="5.124633239s" podCreationTimestamp="2026-01-28 07:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:06:34.123635721 +0000 UTC m=+965.539295891" watchObservedRunningTime="2026-01-28 07:06:34.124633239 +0000 UTC m=+965.540293399" Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.358695 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:34 crc kubenswrapper[4776]: I0128 07:06:34.364429 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-9hx9z"] Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.104807 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e356c55c-adea-433d-9f03-a403f330b085","Type":"ContainerStarted","Data":"53bdf464c1b2774839d07096e15b51861959961e63c4fcbb5f373347ce32f0c7"} Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.111492 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e46126d4-c96e-4d66-9a2e-7f6873a6a1dd","Type":"ContainerStarted","Data":"9a5f15efbc28958eb944e27ead9076b9b6a88f25d3ba830f17074a13c2459544"} Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.114356 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fd4193d3-abf1-457c-a774-de938b12b909","Type":"ContainerStarted","Data":"51e2f5f4c752eba607eac180a04e2eeaa3a29f84b99deeb4c3461e26741e0586"} Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.118009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-59r62" event={"ID":"a390a227-8301-4ed3-80ee-06131089f499","Type":"ContainerStarted","Data":"f7743b9aa8ab6a4567222ba6d520c9f040b2937f0793c75668702855496cbc94"} Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.132443 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.235126234 podStartE2EDuration="23.132418507s" podCreationTimestamp="2026-01-28 07:06:12 +0000 UTC" firstStartedPulling="2026-01-28 07:06:18.112238913 +0000 UTC m=+949.527899073" lastFinishedPulling="2026-01-28 07:06:34.009531176 +0000 UTC m=+965.425191346" observedRunningTime="2026-01-28 07:06:35.128137561 +0000 UTC m=+966.543797761" watchObservedRunningTime="2026-01-28 07:06:35.132418507 +0000 UTC m=+966.548078707" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.167734 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.133170348 podStartE2EDuration="25.167713244s" podCreationTimestamp="2026-01-28 07:06:10 +0000 UTC" firstStartedPulling="2026-01-28 07:06:18.021346587 +0000 UTC m=+949.437006747" lastFinishedPulling="2026-01-28 07:06:34.055889483 +0000 UTC m=+965.471549643" observedRunningTime="2026-01-28 07:06:35.151403682 +0000 UTC m=+966.567063882" watchObservedRunningTime="2026-01-28 07:06:35.167713244 +0000 UTC m=+966.583373424" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.181648 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.665090085 podStartE2EDuration="33.181631142s" podCreationTimestamp="2026-01-28 07:06:02 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.405936003 +0000 UTC m=+948.821596163" lastFinishedPulling="2026-01-28 07:06:27.92247706 +0000 UTC m=+959.338137220" observedRunningTime="2026-01-28 07:06:35.178094386 +0000 UTC m=+966.593754556" watchObservedRunningTime="2026-01-28 07:06:35.181631142 +0000 UTC m=+966.597291302" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.231816 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-59r62" podStartSLOduration=4.25208384 
podStartE2EDuration="7.231784342s" podCreationTimestamp="2026-01-28 07:06:28 +0000 UTC" firstStartedPulling="2026-01-28 07:06:31.05252239 +0000 UTC m=+962.468182550" lastFinishedPulling="2026-01-28 07:06:34.032222902 +0000 UTC m=+965.447883052" observedRunningTime="2026-01-28 07:06:35.199859587 +0000 UTC m=+966.615519747" watchObservedRunningTime="2026-01-28 07:06:35.231784342 +0000 UTC m=+966.647444542" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.234667 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.290721 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.290767 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.303320 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.318129 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81879d5-73b4-4375-929a-9c7ebf311145" path="/var/lib/kubelet/pods/a81879d5-73b4-4375-929a-9c7ebf311145/volumes" Jan 28 07:06:35 crc kubenswrapper[4776]: I0128 07:06:35.575378 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.068256 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.109575 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.124445 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.125265 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.161768 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.168248 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.415748 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 28 07:06:36 crc kubenswrapper[4776]: E0128 07:06:36.416081 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81879d5-73b4-4375-929a-9c7ebf311145" containerName="init" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.416097 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81879d5-73b4-4375-929a-9c7ebf311145" containerName="init" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.416283 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81879d5-73b4-4375-929a-9c7ebf311145" containerName="init" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.417109 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.419255 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.419276 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.419291 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2pfmx" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.419760 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.431534 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.514946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-scripts\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.514998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.515026 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-config\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 
07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.515199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.515262 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7rn7\" (UniqueName: \"kubernetes.io/projected/aff78447-a04f-4c5b-871f-3b47df7325c8-kube-api-access-s7rn7\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.515289 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.515413 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616331 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7rn7\" (UniqueName: \"kubernetes.io/projected/aff78447-a04f-4c5b-871f-3b47df7325c8-kube-api-access-s7rn7\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616416 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616451 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-scripts\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-config\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.616579 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.617059 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.617430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-scripts\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.617906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff78447-a04f-4c5b-871f-3b47df7325c8-config\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.641007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.645304 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.647315 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aff78447-a04f-4c5b-871f-3b47df7325c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.651292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7rn7\" (UniqueName: \"kubernetes.io/projected/aff78447-a04f-4c5b-871f-3b47df7325c8-kube-api-access-s7rn7\") pod \"ovn-northd-0\" (UID: \"aff78447-a04f-4c5b-871f-3b47df7325c8\") " pod="openstack/ovn-northd-0" Jan 28 07:06:36 crc kubenswrapper[4776]: I0128 07:06:36.736477 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.139583 4776 generic.go:334] "Generic (PLEG): container finished" podID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerID="35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29" exitCode=0 Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.139650 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerDied","Data":"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29"} Jan 28 07:06:37 crc kubenswrapper[4776]: W0128 07:06:37.187104 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff78447_a04f_4c5b_871f_3b47df7325c8.slice/crio-5e1178a5f83e4288c4fb2b70e04cafafba6bdef12c1eef6c17dfe351e8977d55 WatchSource:0}: Error finding container 5e1178a5f83e4288c4fb2b70e04cafafba6bdef12c1eef6c17dfe351e8977d55: Status 404 returned error can't find the container with id 5e1178a5f83e4288c4fb2b70e04cafafba6bdef12c1eef6c17dfe351e8977d55 Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.198679 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 07:06:37 crc 
kubenswrapper[4776]: I0128 07:06:37.432314 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.446001 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.446208 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="dnsmasq-dns" containerID="cri-o://f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc" gracePeriod=10 Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.490278 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.492755 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.537892 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.537971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.538032 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vbz8q\" (UniqueName: \"kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.538116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.538172 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.549781 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.646674 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.646784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbz8q\" (UniqueName: \"kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " 
pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.646832 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.646903 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.646947 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.647996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.648687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.648859 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.649343 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.670062 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbz8q\" (UniqueName: \"kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q\") pod \"dnsmasq-dns-698758b865-z7ngz\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.758037 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.862086 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:37 crc kubenswrapper[4776]: I0128 07:06:37.869116 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.042275 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.155298 4776 generic.go:334] "Generic (PLEG): container finished" podID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerID="f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc" exitCode=0 Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.155360 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" event={"ID":"088e8d0a-b7b2-48dc-9920-6d49987066a5","Type":"ContainerDied","Data":"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc"} Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.155386 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" event={"ID":"088e8d0a-b7b2-48dc-9920-6d49987066a5","Type":"ContainerDied","Data":"8a0647e2086a5f5f4e2879ff7ad5299f7d263399616deed3067a5a95237f30a9"} Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.155408 4776 scope.go:117] "RemoveContainer" containerID="f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.155530 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2hz7d" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.157459 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb\") pod \"088e8d0a-b7b2-48dc-9920-6d49987066a5\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.157564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb\") pod \"088e8d0a-b7b2-48dc-9920-6d49987066a5\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.157636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc\") pod \"088e8d0a-b7b2-48dc-9920-6d49987066a5\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.157692 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q5gg\" (UniqueName: \"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg\") pod \"088e8d0a-b7b2-48dc-9920-6d49987066a5\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.157727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config\") pod \"088e8d0a-b7b2-48dc-9920-6d49987066a5\" (UID: \"088e8d0a-b7b2-48dc-9920-6d49987066a5\") " Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.162769 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg" (OuterVolumeSpecName: "kube-api-access-4q5gg") pod "088e8d0a-b7b2-48dc-9920-6d49987066a5" (UID: "088e8d0a-b7b2-48dc-9920-6d49987066a5"). InnerVolumeSpecName "kube-api-access-4q5gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.165213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aff78447-a04f-4c5b-871f-3b47df7325c8","Type":"ContainerStarted","Data":"5e1178a5f83e4288c4fb2b70e04cafafba6bdef12c1eef6c17dfe351e8977d55"} Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.189983 4776 scope.go:117] "RemoveContainer" containerID="8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.212437 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config" (OuterVolumeSpecName: "config") pod "088e8d0a-b7b2-48dc-9920-6d49987066a5" (UID: "088e8d0a-b7b2-48dc-9920-6d49987066a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.212868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "088e8d0a-b7b2-48dc-9920-6d49987066a5" (UID: "088e8d0a-b7b2-48dc-9920-6d49987066a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.217700 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "088e8d0a-b7b2-48dc-9920-6d49987066a5" (UID: "088e8d0a-b7b2-48dc-9920-6d49987066a5"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.219646 4776 scope.go:117] "RemoveContainer" containerID="f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.219714 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "088e8d0a-b7b2-48dc-9920-6d49987066a5" (UID: "088e8d0a-b7b2-48dc-9920-6d49987066a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.219999 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc\": container with ID starting with f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc not found: ID does not exist" containerID="f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.220034 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc"} err="failed to get container status \"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc\": rpc error: code = NotFound desc = could not find container \"f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc\": container with ID starting with f7f28cfaa86b4b8ae053089817f82a635407b8e12d574cfcca4c038462dda0bc not found: ID does not exist" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.220069 4776 scope.go:117] "RemoveContainer" containerID="8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0" Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 
07:06:38.220265 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0\": container with ID starting with 8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0 not found: ID does not exist" containerID="8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.220294 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0"} err="failed to get container status \"8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0\": rpc error: code = NotFound desc = could not find container \"8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0\": container with ID starting with 8e43912cb3586a92b65bc30255d56610973d5f0c8fcc7238f7145b2194955fd0 not found: ID does not exist" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.259268 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.259294 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q5gg\" (UniqueName: \"kubernetes.io/projected/088e8d0a-b7b2-48dc-9920-6d49987066a5-kube-api-access-4q5gg\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.259306 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.259315 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.259324 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/088e8d0a-b7b2-48dc-9920-6d49987066a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.338894 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.505208 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.512778 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2hz7d"] Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.593730 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.594120 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="init" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.594133 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="init" Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.594149 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="dnsmasq-dns" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.594157 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="dnsmasq-dns" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.594307 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" containerName="dnsmasq-dns" Jan 28 
07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.599658 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.601846 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.602918 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.603049 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mnz9l" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.603180 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.621681 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668161 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-cache\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: 
\"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668288 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331ac509-cce0-4545-ac41-1224aae65295-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668350 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtjr\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-kube-api-access-rbtjr\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.668378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-lock\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-lock\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770537 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-cache\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770593 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331ac509-cce0-4545-ac41-1224aae65295-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.770643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtjr\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-kube-api-access-rbtjr\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.771003 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.771077 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.771088 4776 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 28 07:06:38 crc kubenswrapper[4776]: E0128 07:06:38.771252 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:06:39.271233328 +0000 UTC m=+970.686893488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found
Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.771260 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-lock\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.771629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/331ac509-cce0-4545-ac41-1224aae65295-cache\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.788588 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331ac509-cce0-4545-ac41-1224aae65295-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.803794 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:38 crc kubenswrapper[4776]: I0128 07:06:38.804668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtjr\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-kube-api-access-rbtjr\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:39 crc kubenswrapper[4776]: I0128 07:06:39.174020 4776 generic.go:334] "Generic (PLEG): container finished" podID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerID="e19299793f5af9f7ad66a0415eda83c2e7623f7ea3699ca96a249adc07707741" exitCode=0
Jan 28 07:06:39 crc kubenswrapper[4776]: I0128 07:06:39.174061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z7ngz" event={"ID":"12e40a7e-8ea9-4135-9ec1-3904792273aa","Type":"ContainerDied","Data":"e19299793f5af9f7ad66a0415eda83c2e7623f7ea3699ca96a249adc07707741"}
Jan 28 07:06:39 crc kubenswrapper[4776]: I0128 07:06:39.174084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z7ngz" event={"ID":"12e40a7e-8ea9-4135-9ec1-3904792273aa","Type":"ContainerStarted","Data":"db10803f5b3508d5557efa2032ff045f1cd5dfa95317cb966efdc9ad0130caa4"}
Jan 28 07:06:39 crc kubenswrapper[4776]: I0128 07:06:39.302208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:39 crc kubenswrapper[4776]: E0128 07:06:39.302754 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 28 07:06:39 crc kubenswrapper[4776]: E0128 07:06:39.302793 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 28 07:06:39 crc kubenswrapper[4776]: E0128 07:06:39.302858 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:06:40.302833709 +0000 UTC m=+971.718493939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found
Jan 28 07:06:39 crc kubenswrapper[4776]: I0128 07:06:39.323095 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088e8d0a-b7b2-48dc-9920-6d49987066a5" path="/var/lib/kubelet/pods/088e8d0a-b7b2-48dc-9920-6d49987066a5/volumes"
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.184660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z7ngz" event={"ID":"12e40a7e-8ea9-4135-9ec1-3904792273aa","Type":"ContainerStarted","Data":"f4394b9a8079a15ae6383b73702a9057bef9dc53788ffa5b52761fb10f4d85d0"}
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.185659 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-z7ngz"
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.189648 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aff78447-a04f-4c5b-871f-3b47df7325c8","Type":"ContainerStarted","Data":"990247e8bb69afee5acaa3fb80b6b7db33e7f4bc630f3f1d0c776ea439a5f0d3"}
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.189679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aff78447-a04f-4c5b-871f-3b47df7325c8","Type":"ContainerStarted","Data":"6189c984af019e4b52244266586eadc54766aea4f9247cf637da021382b206f0"}
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.189862 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.204895 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-z7ngz" podStartSLOduration=3.204881719 podStartE2EDuration="3.204881719s" podCreationTimestamp="2026-01-28 07:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:06:40.202789792 +0000 UTC m=+971.618449962" watchObservedRunningTime="2026-01-28 07:06:40.204881719 +0000 UTC m=+971.620541879"
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.224221 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.053430947 podStartE2EDuration="4.224199544s" podCreationTimestamp="2026-01-28 07:06:36 +0000 UTC" firstStartedPulling="2026-01-28 07:06:37.190070066 +0000 UTC m=+968.605730216" lastFinishedPulling="2026-01-28 07:06:39.360838653 +0000 UTC m=+970.776498813" observedRunningTime="2026-01-28 07:06:40.220898874 +0000 UTC m=+971.636559034" watchObservedRunningTime="2026-01-28 07:06:40.224199544 +0000 UTC m=+971.639859694"
Jan 28 07:06:40 crc kubenswrapper[4776]: I0128 07:06:40.321147 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:40 crc kubenswrapper[4776]: E0128 07:06:40.321333 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 28 07:06:40 crc kubenswrapper[4776]: E0128 07:06:40.321363 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 28 07:06:40 crc kubenswrapper[4776]: E0128 07:06:40.321421 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:06:42.321402361 +0000 UTC m=+973.737062611 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.355906 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0"
Jan 28 07:06:42 crc kubenswrapper[4776]: E0128 07:06:42.357373 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 28 07:06:42 crc kubenswrapper[4776]: E0128 07:06:42.357390 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 28 07:06:42 crc kubenswrapper[4776]: E0128 07:06:42.357427 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:06:46.35741383 +0000 UTC m=+977.773073990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.511112 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cmmlx"]
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.512117 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.518450 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.518878 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.518978 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.533277 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cmmlx"]
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563570 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htc6p\" (UniqueName: \"kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.563833 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665155 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htc6p\" (UniqueName: \"kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665260 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.665680 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.666131 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.666592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.676140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.676184 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.676320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.691294 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htc6p\" (UniqueName: \"kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p\") pod \"swift-ring-rebalance-cmmlx\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:42 crc kubenswrapper[4776]: I0128 07:06:42.826143 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cmmlx"
Jan 28 07:06:43 crc kubenswrapper[4776]: I0128 07:06:43.274443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cmmlx"]
Jan 28 07:06:43 crc kubenswrapper[4776]: I0128 07:06:43.966132 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qw6h6"]
Jan 28 07:06:43 crc kubenswrapper[4776]: I0128 07:06:43.967488 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:43 crc kubenswrapper[4776]: I0128 07:06:43.971604 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 28 07:06:43 crc kubenswrapper[4776]: I0128 07:06:43.979021 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qw6h6"]
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.057538 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.057642 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.103170 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhh6\" (UniqueName: \"kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.103371 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.138763 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.204959 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.205114 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhh6\" (UniqueName: \"kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.206045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.245349 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhh6\" (UniqueName: \"kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6\") pod \"root-account-create-update-qw6h6\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.301626 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qw6h6"
Jan 28 07:06:44 crc kubenswrapper[4776]: I0128 07:06:44.330100 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.188444 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fjb7k"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.190103 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.208147 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fjb7k"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.318527 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-87a7-account-create-update-f8927"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.319943 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.324746 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.326742 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87a7-account-create-update-f8927"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.328532 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.328674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86pb\" (UniqueName: \"kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.433945 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.434029 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.434064 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86pb\" (UniqueName: \"kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.434080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlv5\" (UniqueName: \"kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.434834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.468078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s86pb\" (UniqueName: \"kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb\") pod \"keystone-db-create-fjb7k\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.499087 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-294lg"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.503147 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.511801 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-294lg"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.516751 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjb7k"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.534980 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.535060 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlv5\" (UniqueName: \"kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.536007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.550807 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlv5\" (UniqueName: \"kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5\") pod \"keystone-87a7-account-create-update-f8927\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.627270 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc6c-account-create-update-h8spp"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.628907 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.630843 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.635400 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc6c-account-create-update-h8spp"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.636444 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.636509 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkppl\" (UniqueName: \"kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.642380 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87a7-account-create-update-f8927"
Jan 28 07:06:45 crc kubenswrapper[4776]: W0128 07:06:45.718388 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83b6bd4_3813_465a_aa62_8bb029d2fcc0.slice/crio-f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b WatchSource:0}: Error finding container f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b: Status 404 returned error can't find the container with id f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.733428 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9rmhk"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.734773 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9rmhk"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.738219 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgj8g\" (UniqueName: \"kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.738430 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.738494 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.738567 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkppl\" (UniqueName: \"kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.743141 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9rmhk"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.747983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.770684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkppl\" (UniqueName: \"kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl\") pod \"placement-db-create-294lg\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.817972 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d25e-account-create-update-txwtx"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.819399 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d25e-account-create-update-txwtx"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.821420 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.824715 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d25e-account-create-update-txwtx"]
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.825800 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-294lg"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.841112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgj8g\" (UniqueName: \"kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.841211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.841245 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvq2\" (UniqueName: \"kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.841305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.842231 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.855779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgj8g\" (UniqueName: \"kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g\") pod \"placement-fc6c-account-create-update-h8spp\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " pod="openstack/placement-fc6c-account-create-update-h8spp"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.942924 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptwv\" (UniqueName: \"kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv\") pod \"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx"
Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.943061 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts\") pod \"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx"
Jan 28
07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.943166 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.943257 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvq2\" (UniqueName: \"kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.944206 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.947162 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc6c-account-create-update-h8spp" Jan 28 07:06:45 crc kubenswrapper[4776]: I0128 07:06:45.970378 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvq2\" (UniqueName: \"kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2\") pod \"glance-db-create-9rmhk\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.045038 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptwv\" (UniqueName: \"kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv\") pod \"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.045096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts\") pod \"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.046196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts\") pod \"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.073152 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptwv\" (UniqueName: \"kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv\") pod 
\"glance-d25e-account-create-update-txwtx\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.177690 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.192669 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.249389 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cmmlx" event={"ID":"a83b6bd4-3813-465a-aa62-8bb029d2fcc0","Type":"ContainerStarted","Data":"f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b"} Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.260142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerStarted","Data":"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876"} Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.270130 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qw6h6"] Jan 28 07:06:46 crc kubenswrapper[4776]: W0128 07:06:46.301324 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c8c8852_9160_4156_aa38_7f8660a42c9c.slice/crio-a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af WatchSource:0}: Error finding container a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af: Status 404 returned error can't find the container with id a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.357788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:46 crc kubenswrapper[4776]: E0128 07:06:46.357901 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 07:06:46 crc kubenswrapper[4776]: E0128 07:06:46.358008 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 07:06:46 crc kubenswrapper[4776]: E0128 07:06:46.358053 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:06:54.358036774 +0000 UTC m=+985.773696944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.386109 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87a7-account-create-update-f8927"] Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.403605 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fjb7k"] Jan 28 07:06:46 crc kubenswrapper[4776]: W0128 07:06:46.414613 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc92552a8_f85e_44d3_9b6f_d3614a6bc92a.slice/crio-7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2 WatchSource:0}: Error finding container 7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2: Status 404 
returned error can't find the container with id 7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2 Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.523681 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-294lg"] Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.538369 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc6c-account-create-update-h8spp"] Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.696481 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9rmhk"] Jan 28 07:06:46 crc kubenswrapper[4776]: I0128 07:06:46.702642 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d25e-account-create-update-txwtx"] Jan 28 07:06:46 crc kubenswrapper[4776]: W0128 07:06:46.746383 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c623f99_9905_4a7e_addc_022e48fb40bf.slice/crio-ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238 WatchSource:0}: Error finding container ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238: Status 404 returned error can't find the container with id ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238 Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.271293 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c8c8852-9160-4156-aa38-7f8660a42c9c" containerID="27b39f6875088c8f69a97f8aaf7dd625d73dbbec0abbf0730e71e0872238c7d8" exitCode=0 Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.271365 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qw6h6" event={"ID":"0c8c8852-9160-4156-aa38-7f8660a42c9c","Type":"ContainerDied","Data":"27b39f6875088c8f69a97f8aaf7dd625d73dbbec0abbf0730e71e0872238c7d8"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.271390 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-qw6h6" event={"ID":"0c8c8852-9160-4156-aa38-7f8660a42c9c","Type":"ContainerStarted","Data":"a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.272985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87a7-account-create-update-f8927" event={"ID":"56e1e400-5c65-433a-a52e-01160edeb76a","Type":"ContainerStarted","Data":"f1c6280e1c6756604e7d1fe0f44835aca83672edbe2e48eb3d530dd99a371d51"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.273033 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87a7-account-create-update-f8927" event={"ID":"56e1e400-5c65-433a-a52e-01160edeb76a","Type":"ContainerStarted","Data":"52a987935866c7e5391400c7835284c488ed653ef68c0e38000da74fd9e654d0"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.282668 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d25e-account-create-update-txwtx" event={"ID":"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e","Type":"ContainerStarted","Data":"19d3d242da2d3f8c71f6cb97924c41d39a325eb3503fcbf41ae5fb60daf59627"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.284889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9rmhk" event={"ID":"5c623f99-9905-4a7e-addc-022e48fb40bf","Type":"ContainerStarted","Data":"ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.289240 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc6c-account-create-update-h8spp" event={"ID":"6d2eae23-aa55-4a27-a54c-b58d28da7b56","Type":"ContainerStarted","Data":"c69eb5fe3d965dc3069a35c58841cf4530b29ae95d561ab053761c41e24e4665"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.289302 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-fc6c-account-create-update-h8spp" event={"ID":"6d2eae23-aa55-4a27-a54c-b58d28da7b56","Type":"ContainerStarted","Data":"e17751dc10eb15abe7a9276481a55b400f41f8f27fa3cf96196c0862c4328d1f"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.292942 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjb7k" event={"ID":"c92552a8-f85e-44d3-9b6f-d3614a6bc92a","Type":"ContainerStarted","Data":"7c311c8d46ebc562fde60e40bb783e27af644bbe89d573f240b1b41301091aef"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.292978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjb7k" event={"ID":"c92552a8-f85e-44d3-9b6f-d3614a6bc92a","Type":"ContainerStarted","Data":"7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.294444 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-294lg" event={"ID":"6adc8ae9-dfbe-4c04-b844-b2fb424bda14","Type":"ContainerStarted","Data":"5d35f38428bebe58aae6212e4749e16570cae904aea598445fdbe420e23d0047"} Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.304824 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-87a7-account-create-update-f8927" podStartSLOduration=2.304807393 podStartE2EDuration="2.304807393s" podCreationTimestamp="2026-01-28 07:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:06:47.299393296 +0000 UTC m=+978.715053456" watchObservedRunningTime="2026-01-28 07:06:47.304807393 +0000 UTC m=+978.720467553" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.332090 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-fjb7k" podStartSLOduration=2.332071094 podStartE2EDuration="2.332071094s" 
podCreationTimestamp="2026-01-28 07:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:06:47.320109329 +0000 UTC m=+978.735769529" watchObservedRunningTime="2026-01-28 07:06:47.332071094 +0000 UTC m=+978.747731274" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.334711 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc6c-account-create-update-h8spp" podStartSLOduration=2.334698396 podStartE2EDuration="2.334698396s" podCreationTimestamp="2026-01-28 07:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:06:47.332249699 +0000 UTC m=+978.747909869" watchObservedRunningTime="2026-01-28 07:06:47.334698396 +0000 UTC m=+978.750358566" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.452784 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-9c8kt"] Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.454934 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.474635 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-9c8kt"] Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.560062 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-1f0e-account-create-update-rrjzl"] Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.561777 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.565029 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.584881 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1f0e-account-create-update-rrjzl"] Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.597988 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2s7h\" (UniqueName: \"kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.598056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.699107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2s7h\" (UniqueName: \"kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.699162 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts\") pod \"watcher-1f0e-account-create-update-rrjzl\" (UID: 
\"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.699190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.699410 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qvl\" (UniqueName: \"kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl\") pod \"watcher-1f0e-account-create-update-rrjzl\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.700514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.732706 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2s7h\" (UniqueName: \"kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h\") pod \"watcher-db-create-9c8kt\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.800634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49qvl\" (UniqueName: \"kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl\") pod 
\"watcher-1f0e-account-create-update-rrjzl\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.800825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts\") pod \"watcher-1f0e-account-create-update-rrjzl\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.801476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts\") pod \"watcher-1f0e-account-create-update-rrjzl\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.815834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49qvl\" (UniqueName: \"kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl\") pod \"watcher-1f0e-account-create-update-rrjzl\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.848655 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.863677 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.882768 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.918150 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:47 crc kubenswrapper[4776]: I0128 07:06:47.918743 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="dnsmasq-dns" containerID="cri-o://0479e45791f23fc97d375c97d7fa3c7220435eae00d8033bdb9fd35e46fabe0e" gracePeriod=10 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.306509 4776 generic.go:334] "Generic (PLEG): container finished" podID="6adc8ae9-dfbe-4c04-b844-b2fb424bda14" containerID="4f40bcd3111dd5a48ad78e3666f60f367d1016eb6651d7d637e676fb1c5677a8" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.306594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-294lg" event={"ID":"6adc8ae9-dfbe-4c04-b844-b2fb424bda14","Type":"ContainerDied","Data":"4f40bcd3111dd5a48ad78e3666f60f367d1016eb6651d7d637e676fb1c5677a8"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.325772 4776 generic.go:334] "Generic (PLEG): container finished" podID="56e1e400-5c65-433a-a52e-01160edeb76a" containerID="f1c6280e1c6756604e7d1fe0f44835aca83672edbe2e48eb3d530dd99a371d51" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.325871 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87a7-account-create-update-f8927" event={"ID":"56e1e400-5c65-433a-a52e-01160edeb76a","Type":"ContainerDied","Data":"f1c6280e1c6756604e7d1fe0f44835aca83672edbe2e48eb3d530dd99a371d51"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.327482 4776 generic.go:334] "Generic (PLEG): container finished" podID="a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" containerID="8fba4c3d1b4dfd12be137fcdca0b52c484e61f02dc4676663dfd49f415cabc4b" 
exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.327525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d25e-account-create-update-txwtx" event={"ID":"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e","Type":"ContainerDied","Data":"8fba4c3d1b4dfd12be137fcdca0b52c484e61f02dc4676663dfd49f415cabc4b"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.341240 4776 generic.go:334] "Generic (PLEG): container finished" podID="5c623f99-9905-4a7e-addc-022e48fb40bf" containerID="6a3eea09d446cfa1d0df82cc2e18518c23710a2669ad38f164b55ee1fbd660f7" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.341311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9rmhk" event={"ID":"5c623f99-9905-4a7e-addc-022e48fb40bf","Type":"ContainerDied","Data":"6a3eea09d446cfa1d0df82cc2e18518c23710a2669ad38f164b55ee1fbd660f7"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.357358 4776 generic.go:334] "Generic (PLEG): container finished" podID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerID="0479e45791f23fc97d375c97d7fa3c7220435eae00d8033bdb9fd35e46fabe0e" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.357420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" event={"ID":"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc","Type":"ContainerDied","Data":"0479e45791f23fc97d375c97d7fa3c7220435eae00d8033bdb9fd35e46fabe0e"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.358432 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d2eae23-aa55-4a27-a54c-b58d28da7b56" containerID="c69eb5fe3d965dc3069a35c58841cf4530b29ae95d561ab053761c41e24e4665" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.358474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc6c-account-create-update-h8spp" 
event={"ID":"6d2eae23-aa55-4a27-a54c-b58d28da7b56","Type":"ContainerDied","Data":"c69eb5fe3d965dc3069a35c58841cf4530b29ae95d561ab053761c41e24e4665"} Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.359385 4776 generic.go:334] "Generic (PLEG): container finished" podID="c92552a8-f85e-44d3-9b6f-d3614a6bc92a" containerID="7c311c8d46ebc562fde60e40bb783e27af644bbe89d573f240b1b41301091aef" exitCode=0 Jan 28 07:06:48 crc kubenswrapper[4776]: I0128 07:06:48.359421 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjb7k" event={"ID":"c92552a8-f85e-44d3-9b6f-d3614a6bc92a","Type":"ContainerDied","Data":"7c311c8d46ebc562fde60e40bb783e27af644bbe89d573f240b1b41301091aef"} Jan 28 07:06:49 crc kubenswrapper[4776]: I0128 07:06:49.384616 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerStarted","Data":"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967"} Jan 28 07:06:51 crc kubenswrapper[4776]: I0128 07:06:51.668300 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.102:5353: connect: connection refused" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.033986 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87a7-account-create-update-f8927" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.038818 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qw6h6" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.044181 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc6c-account-create-update-h8spp" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.063126 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-294lg" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.069893 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjb7k" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.099950 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.133742 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlv5\" (UniqueName: \"kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5\") pod \"56e1e400-5c65-433a-a52e-01160edeb76a\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134308 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86pb\" (UniqueName: \"kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb\") pod \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134339 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhh6\" (UniqueName: \"kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6\") pod \"0c8c8852-9160-4156-aa38-7f8660a42c9c\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134755 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkppl\" (UniqueName: 
\"kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl\") pod \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134789 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgj8g\" (UniqueName: \"kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g\") pod \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134805 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts\") pod \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\" (UID: \"c92552a8-f85e-44d3-9b6f-d3614a6bc92a\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts\") pod \"0c8c8852-9160-4156-aa38-7f8660a42c9c\" (UID: \"0c8c8852-9160-4156-aa38-7f8660a42c9c\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.134927 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts\") pod \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\" (UID: \"6d2eae23-aa55-4a27-a54c-b58d28da7b56\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.135098 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts\") pod \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\" (UID: \"6adc8ae9-dfbe-4c04-b844-b2fb424bda14\") " Jan 28 07:06:53 
crc kubenswrapper[4776]: I0128 07:06:53.135127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts\") pod \"56e1e400-5c65-433a-a52e-01160edeb76a\" (UID: \"56e1e400-5c65-433a-a52e-01160edeb76a\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.135703 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c92552a8-f85e-44d3-9b6f-d3614a6bc92a" (UID: "c92552a8-f85e-44d3-9b6f-d3614a6bc92a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.135990 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56e1e400-5c65-433a-a52e-01160edeb76a" (UID: "56e1e400-5c65-433a-a52e-01160edeb76a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.135992 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.136262 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6adc8ae9-dfbe-4c04-b844-b2fb424bda14" (UID: "6adc8ae9-dfbe-4c04-b844-b2fb424bda14"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.136340 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d2eae23-aa55-4a27-a54c-b58d28da7b56" (UID: "6d2eae23-aa55-4a27-a54c-b58d28da7b56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.138644 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c8c8852-9160-4156-aa38-7f8660a42c9c" (UID: "0c8c8852-9160-4156-aa38-7f8660a42c9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.138680 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb" (OuterVolumeSpecName: "kube-api-access-s86pb") pod "c92552a8-f85e-44d3-9b6f-d3614a6bc92a" (UID: "c92552a8-f85e-44d3-9b6f-d3614a6bc92a"). InnerVolumeSpecName "kube-api-access-s86pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.139098 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6" (OuterVolumeSpecName: "kube-api-access-qjhh6") pod "0c8c8852-9160-4156-aa38-7f8660a42c9c" (UID: "0c8c8852-9160-4156-aa38-7f8660a42c9c"). InnerVolumeSpecName "kube-api-access-qjhh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.139400 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g" (OuterVolumeSpecName: "kube-api-access-sgj8g") pod "6d2eae23-aa55-4a27-a54c-b58d28da7b56" (UID: "6d2eae23-aa55-4a27-a54c-b58d28da7b56"). InnerVolumeSpecName "kube-api-access-sgj8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.140054 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl" (OuterVolumeSpecName: "kube-api-access-qkppl") pod "6adc8ae9-dfbe-4c04-b844-b2fb424bda14" (UID: "6adc8ae9-dfbe-4c04-b844-b2fb424bda14"). InnerVolumeSpecName "kube-api-access-qkppl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.142016 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5" (OuterVolumeSpecName: "kube-api-access-wmlv5") pod "56e1e400-5c65-433a-a52e-01160edeb76a" (UID: "56e1e400-5c65-433a-a52e-01160edeb76a"). InnerVolumeSpecName "kube-api-access-wmlv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.179766 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.237198 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts\") pod \"5c623f99-9905-4a7e-addc-022e48fb40bf\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.237601 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rptwv\" (UniqueName: \"kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv\") pod \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.237656 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts\") pod \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\" (UID: \"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.237686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frvq2\" (UniqueName: \"kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2\") pod \"5c623f99-9905-4a7e-addc-022e48fb40bf\" (UID: \"5c623f99-9905-4a7e-addc-022e48fb40bf\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238039 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86pb\" (UniqueName: \"kubernetes.io/projected/c92552a8-f85e-44d3-9b6f-d3614a6bc92a-kube-api-access-s86pb\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238058 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhh6\" (UniqueName: 
\"kubernetes.io/projected/0c8c8852-9160-4156-aa38-7f8660a42c9c-kube-api-access-qjhh6\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238067 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkppl\" (UniqueName: \"kubernetes.io/projected/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-kube-api-access-qkppl\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238076 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgj8g\" (UniqueName: \"kubernetes.io/projected/6d2eae23-aa55-4a27-a54c-b58d28da7b56-kube-api-access-sgj8g\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238086 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c8c8852-9160-4156-aa38-7f8660a42c9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238095 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d2eae23-aa55-4a27-a54c-b58d28da7b56-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238103 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6adc8ae9-dfbe-4c04-b844-b2fb424bda14-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238117 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e1e400-5c65-433a-a52e-01160edeb76a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.238126 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlv5\" (UniqueName: 
\"kubernetes.io/projected/56e1e400-5c65-433a-a52e-01160edeb76a-kube-api-access-wmlv5\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.239135 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" (UID: "a8bcf822-d4ff-4cb2-b783-cea86d58ab8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.239141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c623f99-9905-4a7e-addc-022e48fb40bf" (UID: "5c623f99-9905-4a7e-addc-022e48fb40bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.241997 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv" (OuterVolumeSpecName: "kube-api-access-rptwv") pod "a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" (UID: "a8bcf822-d4ff-4cb2-b783-cea86d58ab8e"). InnerVolumeSpecName "kube-api-access-rptwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.242092 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2" (OuterVolumeSpecName: "kube-api-access-frvq2") pod "5c623f99-9905-4a7e-addc-022e48fb40bf" (UID: "5c623f99-9905-4a7e-addc-022e48fb40bf"). InnerVolumeSpecName "kube-api-access-frvq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.264022 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.338726 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc\") pod \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.338783 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config\") pod \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.338821 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dlzx\" (UniqueName: \"kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx\") pod \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\" (UID: \"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc\") " Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.339420 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rptwv\" (UniqueName: \"kubernetes.io/projected/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-kube-api-access-rptwv\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.339468 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.339478 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frvq2\" 
(UniqueName: \"kubernetes.io/projected/5c623f99-9905-4a7e-addc-022e48fb40bf-kube-api-access-frvq2\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.339508 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c623f99-9905-4a7e-addc-022e48fb40bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.346717 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx" (OuterVolumeSpecName: "kube-api-access-5dlzx") pod "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" (UID: "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc"). InnerVolumeSpecName "kube-api-access-5dlzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.373950 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config" (OuterVolumeSpecName: "config") pod "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" (UID: "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.384296 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" (UID: "e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.422637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87a7-account-create-update-f8927" event={"ID":"56e1e400-5c65-433a-a52e-01160edeb76a","Type":"ContainerDied","Data":"52a987935866c7e5391400c7835284c488ed653ef68c0e38000da74fd9e654d0"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.422687 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a987935866c7e5391400c7835284c488ed653ef68c0e38000da74fd9e654d0" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.422647 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87a7-account-create-update-f8927" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.424711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d25e-account-create-update-txwtx" event={"ID":"a8bcf822-d4ff-4cb2-b783-cea86d58ab8e","Type":"ContainerDied","Data":"19d3d242da2d3f8c71f6cb97924c41d39a325eb3503fcbf41ae5fb60daf59627"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.424728 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19d3d242da2d3f8c71f6cb97924c41d39a325eb3503fcbf41ae5fb60daf59627" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.424750 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d25e-account-create-update-txwtx" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.429045 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" event={"ID":"e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc","Type":"ContainerDied","Data":"c53e7202e8a836549fb5e1d23d37fc42a4204a7854db3aa16f63b1f150a46711"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.429083 4776 scope.go:117] "RemoveContainer" containerID="0479e45791f23fc97d375c97d7fa3c7220435eae00d8033bdb9fd35e46fabe0e" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.429214 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t7dww" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.434068 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cmmlx" event={"ID":"a83b6bd4-3813-465a-aa62-8bb029d2fcc0","Type":"ContainerStarted","Data":"279b472a959ea96cdca524bb12827eadc48920b234e55e7b0e4790223e2b8c94"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.441179 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.441206 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.441216 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dlzx\" (UniqueName: \"kubernetes.io/projected/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc-kube-api-access-5dlzx\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.442283 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-fjb7k" event={"ID":"c92552a8-f85e-44d3-9b6f-d3614a6bc92a","Type":"ContainerDied","Data":"7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.442318 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3330f0f461486f1fea81837f815f2b8d7e2dff7e5fff9d73cb19ba5f384fd2" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.442373 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjb7k" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.450816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-294lg" event={"ID":"6adc8ae9-dfbe-4c04-b844-b2fb424bda14","Type":"ContainerDied","Data":"5d35f38428bebe58aae6212e4749e16570cae904aea598445fdbe420e23d0047"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.450843 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-294lg" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.450853 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d35f38428bebe58aae6212e4749e16570cae904aea598445fdbe420e23d0047" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.454373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qw6h6" event={"ID":"0c8c8852-9160-4156-aa38-7f8660a42c9c","Type":"ContainerDied","Data":"a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.454419 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d222bcf2885ef8fd76fd31336d5211c0b03a8db14aa76853abeca97b5688af" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.454459 4776 scope.go:117] "RemoveContainer" containerID="33cf747c3abbf6ecb8c8d515760524c0a94a659c4e6bf147005f33cc34c64bfa" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.454592 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qw6h6" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.469044 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1f0e-account-create-update-rrjzl"] Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.469109 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cmmlx" podStartSLOduration=4.291273438 podStartE2EDuration="11.469082144s" podCreationTimestamp="2026-01-28 07:06:42 +0000 UTC" firstStartedPulling="2026-01-28 07:06:45.748851 +0000 UTC m=+977.164511180" lastFinishedPulling="2026-01-28 07:06:52.926659726 +0000 UTC m=+984.342319886" observedRunningTime="2026-01-28 07:06:53.462706741 +0000 UTC m=+984.878366921" watchObservedRunningTime="2026-01-28 07:06:53.469082144 +0000 UTC m=+984.884742324" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.470909 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9rmhk" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.471208 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9rmhk" event={"ID":"5c623f99-9905-4a7e-addc-022e48fb40bf","Type":"ContainerDied","Data":"ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.471238 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee44d8be9f5a30cfacd9839b33a841972ba6b2dea1e2c3ff5b6fad365a10b238" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.476957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc6c-account-create-update-h8spp" event={"ID":"6d2eae23-aa55-4a27-a54c-b58d28da7b56","Type":"ContainerDied","Data":"e17751dc10eb15abe7a9276481a55b400f41f8f27fa3cf96196c0862c4328d1f"} Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.476994 4776 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="e17751dc10eb15abe7a9276481a55b400f41f8f27fa3cf96196c0862c4328d1f" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.477047 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc6c-account-create-update-h8spp" Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.494401 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.503348 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-9c8kt"] Jan 28 07:06:53 crc kubenswrapper[4776]: I0128 07:06:53.511425 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t7dww"] Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.361618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:06:54 crc kubenswrapper[4776]: E0128 07:06:54.361799 4776 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 07:06:54 crc kubenswrapper[4776]: E0128 07:06:54.362248 4776 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 07:06:54 crc kubenswrapper[4776]: E0128 07:06:54.362299 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift podName:331ac509-cce0-4545-ac41-1224aae65295 nodeName:}" failed. No retries permitted until 2026-01-28 07:07:10.362279495 +0000 UTC m=+1001.777939655 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift") pod "swift-storage-0" (UID: "331ac509-cce0-4545-ac41-1224aae65295") : configmap "swift-ring-files" not found Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.486013 4776 generic.go:334] "Generic (PLEG): container finished" podID="5c4e7e46-a88a-44c2-8679-550be504407e" containerID="5d6c0273b6bbbd7bf666f8f1e1af4130d25302f353b0d9b75c92c540ea4b0c39" exitCode=0 Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.486070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-9c8kt" event={"ID":"5c4e7e46-a88a-44c2-8679-550be504407e","Type":"ContainerDied","Data":"5d6c0273b6bbbd7bf666f8f1e1af4130d25302f353b0d9b75c92c540ea4b0c39"} Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.486093 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-9c8kt" event={"ID":"5c4e7e46-a88a-44c2-8679-550be504407e","Type":"ContainerStarted","Data":"029d198e9829fbb8011a5382b09f2f423c3e29ef4422cec909ff1572e2c934ba"} Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.490008 4776 generic.go:334] "Generic (PLEG): container finished" podID="5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" containerID="3e7f014b1fd104019485fa6d9d39c40d911d6966e2d7759bd0b4caed6a300aee" exitCode=0 Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.490848 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f0e-account-create-update-rrjzl" event={"ID":"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89","Type":"ContainerDied","Data":"3e7f014b1fd104019485fa6d9d39c40d911d6966e2d7759bd0b4caed6a300aee"} Jan 28 07:06:54 crc kubenswrapper[4776]: I0128 07:06:54.490880 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f0e-account-create-update-rrjzl" 
event={"ID":"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89","Type":"ContainerStarted","Data":"43adbb9911ba29d0cf5493fd14e1c0382c206dbd1635ab1fc390ed8a87c3ebec"} Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.320490 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" path="/var/lib/kubelet/pods/e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc/volumes" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.505039 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerStarted","Data":"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a"} Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.546893 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.342424437 podStartE2EDuration="48.546870874s" podCreationTimestamp="2026-01-28 07:06:07 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.977428576 +0000 UTC m=+949.393088736" lastFinishedPulling="2026-01-28 07:06:55.181875013 +0000 UTC m=+986.597535173" observedRunningTime="2026-01-28 07:06:55.544422728 +0000 UTC m=+986.960082888" watchObservedRunningTime="2026-01-28 07:06:55.546870874 +0000 UTC m=+986.962531054" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.973980 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s2gfx"] Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974285 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="init" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974298 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="init" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974307 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0c8c8852-9160-4156-aa38-7f8660a42c9c" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974314 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8c8852-9160-4156-aa38-7f8660a42c9c" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974323 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e1e400-5c65-433a-a52e-01160edeb76a" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974329 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e1e400-5c65-433a-a52e-01160edeb76a" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974340 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974345 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974356 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92552a8-f85e-44d3-9b6f-d3614a6bc92a" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974362 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92552a8-f85e-44d3-9b6f-d3614a6bc92a" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974372 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2eae23-aa55-4a27-a54c-b58d28da7b56" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974378 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2eae23-aa55-4a27-a54c-b58d28da7b56" containerName="mariadb-account-create-update" Jan 28 
07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974389 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c623f99-9905-4a7e-addc-022e48fb40bf" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974395 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c623f99-9905-4a7e-addc-022e48fb40bf" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974401 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="dnsmasq-dns" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974407 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="dnsmasq-dns" Jan 28 07:06:55 crc kubenswrapper[4776]: E0128 07:06:55.974420 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adc8ae9-dfbe-4c04-b844-b2fb424bda14" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.974425 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adc8ae9-dfbe-4c04-b844-b2fb424bda14" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976643 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92552a8-f85e-44d3-9b6f-d3614a6bc92a" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976665 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e1e400-5c65-433a-a52e-01160edeb76a" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976680 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8c8852-9160-4156-aa38-7f8660a42c9c" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976686 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5c623f99-9905-4a7e-addc-022e48fb40bf" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976697 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e365bfc7-99d2-4c63-b0ec-fdeafa6eb7bc" containerName="dnsmasq-dns" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976713 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2eae23-aa55-4a27-a54c-b58d28da7b56" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976719 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adc8ae9-dfbe-4c04-b844-b2fb424bda14" containerName="mariadb-database-create" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.976730 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" containerName="mariadb-account-create-update" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.977808 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.980316 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.981034 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-88s8h" Jan 28 07:06:55 crc kubenswrapper[4776]: I0128 07:06:55.985468 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s2gfx"] Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.025050 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.032228 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.091698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49qvl\" (UniqueName: \"kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl\") pod \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.091752 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts\") pod \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\" (UID: \"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89\") " Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.091821 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts\") pod \"5c4e7e46-a88a-44c2-8679-550be504407e\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.091972 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2s7h\" (UniqueName: \"kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h\") pod \"5c4e7e46-a88a-44c2-8679-550be504407e\" (UID: \"5c4e7e46-a88a-44c2-8679-550be504407e\") " Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.092154 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.092186 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgmd\" (UniqueName: \"kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.092246 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.092264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.095796 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" (UID: "5a015c10-32ee-47e7-a5f4-21d5ccbe6e89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.095906 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c4e7e46-a88a-44c2-8679-550be504407e" (UID: "5c4e7e46-a88a-44c2-8679-550be504407e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.106378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl" (OuterVolumeSpecName: "kube-api-access-49qvl") pod "5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" (UID: "5a015c10-32ee-47e7-a5f4-21d5ccbe6e89"). InnerVolumeSpecName "kube-api-access-49qvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.110841 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h" (OuterVolumeSpecName: "kube-api-access-f2s7h") pod "5c4e7e46-a88a-44c2-8679-550be504407e" (UID: "5c4e7e46-a88a-44c2-8679-550be504407e"). InnerVolumeSpecName "kube-api-access-f2s7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.193763 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.193818 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgmd\" (UniqueName: \"kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.193886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data\") 
pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.193911 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.194002 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2s7h\" (UniqueName: \"kubernetes.io/projected/5c4e7e46-a88a-44c2-8679-550be504407e-kube-api-access-f2s7h\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.194012 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49qvl\" (UniqueName: \"kubernetes.io/projected/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-kube-api-access-49qvl\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.194023 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.194032 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c4e7e46-a88a-44c2-8679-550be504407e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.198818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc 
kubenswrapper[4776]: I0128 07:06:56.199030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.198939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.211034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgmd\" (UniqueName: \"kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd\") pod \"glance-db-sync-s2gfx\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.342159 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s2gfx" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.537705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-9c8kt" event={"ID":"5c4e7e46-a88a-44c2-8679-550be504407e","Type":"ContainerDied","Data":"029d198e9829fbb8011a5382b09f2f423c3e29ef4422cec909ff1572e2c934ba"} Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.537744 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029d198e9829fbb8011a5382b09f2f423c3e29ef4422cec909ff1572e2c934ba" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.537786 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-9c8kt" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.541606 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1f0e-account-create-update-rrjzl" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.541677 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1f0e-account-create-update-rrjzl" event={"ID":"5a015c10-32ee-47e7-a5f4-21d5ccbe6e89","Type":"ContainerDied","Data":"43adbb9911ba29d0cf5493fd14e1c0382c206dbd1635ab1fc390ed8a87c3ebec"} Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.541718 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43adbb9911ba29d0cf5493fd14e1c0382c206dbd1635ab1fc390ed8a87c3ebec" Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.801145 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 28 07:06:56 crc kubenswrapper[4776]: W0128 07:06:56.955404 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e5e104_629f_43f5_8372_dbe94e3938af.slice/crio-66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560 WatchSource:0}: Error finding container 66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560: Status 404 returned error can't find the container with id 66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560 Jan 28 07:06:56 crc kubenswrapper[4776]: I0128 07:06:56.956120 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s2gfx"] Jan 28 07:06:57 crc kubenswrapper[4776]: I0128 07:06:57.555362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s2gfx" event={"ID":"d8e5e104-629f-43f5-8372-dbe94e3938af","Type":"ContainerStarted","Data":"66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560"} Jan 28 07:06:57 crc 
kubenswrapper[4776]: I0128 07:06:57.759609 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qw6h6"] Jan 28 07:06:57 crc kubenswrapper[4776]: I0128 07:06:57.768072 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qw6h6"] Jan 28 07:06:58 crc kubenswrapper[4776]: I0128 07:06:58.760312 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.008306 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rm2d7"] Jan 28 07:06:59 crc kubenswrapper[4776]: E0128 07:06:59.008708 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" containerName="mariadb-account-create-update" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.008737 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" containerName="mariadb-account-create-update" Jan 28 07:06:59 crc kubenswrapper[4776]: E0128 07:06:59.008775 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4e7e46-a88a-44c2-8679-550be504407e" containerName="mariadb-database-create" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.008785 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4e7e46-a88a-44c2-8679-550be504407e" containerName="mariadb-database-create" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.009018 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4e7e46-a88a-44c2-8679-550be504407e" containerName="mariadb-database-create" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.009060 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" containerName="mariadb-account-create-update" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.009788 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.011637 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.022493 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rm2d7"] Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.156395 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts\") pod \"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.156577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghnw\" (UniqueName: \"kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw\") pod \"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.257493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghnw\" (UniqueName: \"kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw\") pod \"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.257603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts\") pod 
\"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.258276 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts\") pod \"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.281074 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghnw\" (UniqueName: \"kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw\") pod \"root-account-create-update-rm2d7\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.330228 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rm2d7" Jan 28 07:06:59 crc kubenswrapper[4776]: I0128 07:06:59.344163 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8c8852-9160-4156-aa38-7f8660a42c9c" path="/var/lib/kubelet/pods/0c8c8852-9160-4156-aa38-7f8660a42c9c/volumes" Jan 28 07:07:00 crc kubenswrapper[4776]: I0128 07:07:00.509133 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rm2d7"] Jan 28 07:07:00 crc kubenswrapper[4776]: W0128 07:07:00.518781 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35a7426_2d99_402a_8de1_297352679817.slice/crio-bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c WatchSource:0}: Error finding container bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c: Status 404 returned error can't find the container with id bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c Jan 28 07:07:00 crc kubenswrapper[4776]: I0128 07:07:00.581406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm2d7" event={"ID":"b35a7426-2d99-402a-8de1-297352679817","Type":"ContainerStarted","Data":"bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c"} Jan 28 07:07:00 crc kubenswrapper[4776]: I0128 07:07:00.585631 4776 generic.go:334] "Generic (PLEG): container finished" podID="a83b6bd4-3813-465a-aa62-8bb029d2fcc0" containerID="279b472a959ea96cdca524bb12827eadc48920b234e55e7b0e4790223e2b8c94" exitCode=0 Jan 28 07:07:00 crc kubenswrapper[4776]: I0128 07:07:00.585678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cmmlx" event={"ID":"a83b6bd4-3813-465a-aa62-8bb029d2fcc0","Type":"ContainerDied","Data":"279b472a959ea96cdca524bb12827eadc48920b234e55e7b0e4790223e2b8c94"} Jan 28 07:07:01 crc kubenswrapper[4776]: I0128 07:07:01.961535 4776 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cmmlx" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htc6p\" (UniqueName: \"kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105260 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105305 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105421 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105492 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.105521 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts\") pod \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\" (UID: \"a83b6bd4-3813-465a-aa62-8bb029d2fcc0\") " Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.107512 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.107908 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.129978 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p" (OuterVolumeSpecName: "kube-api-access-htc6p") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "kube-api-access-htc6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.138510 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts" (OuterVolumeSpecName: "scripts") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.139415 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.148097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.148607 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a83b6bd4-3813-465a-aa62-8bb029d2fcc0" (UID: "a83b6bd4-3813-465a-aa62-8bb029d2fcc0"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207738 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207785 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htc6p\" (UniqueName: \"kubernetes.io/projected/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-kube-api-access-htc6p\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207806 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207823 4776 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207842 4776 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207859 4776 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.207876 4776 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a83b6bd4-3813-465a-aa62-8bb029d2fcc0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.342937 4776 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lrzjl" podUID="4d2cb31b-ab97-4714-9978-225821819328" containerName="ovn-controller" probeResult="failure" output=< Jan 28 07:07:02 crc kubenswrapper[4776]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 28 07:07:02 crc kubenswrapper[4776]: > Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.403661 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.405866 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mbldl" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.606127 4776 generic.go:334] "Generic (PLEG): container finished" podID="c544ad4a-db14-419a-b423-435e8416f597" containerID="00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d" exitCode=0 Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.606200 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerDied","Data":"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d"} Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.607747 4776 generic.go:334] "Generic (PLEG): container finished" podID="b35a7426-2d99-402a-8de1-297352679817" containerID="47644434cf328c5ab572708b384e78051d8b9c97b2853d2584103061f2009d63" exitCode=0 Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.607821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm2d7" event={"ID":"b35a7426-2d99-402a-8de1-297352679817","Type":"ContainerDied","Data":"47644434cf328c5ab572708b384e78051d8b9c97b2853d2584103061f2009d63"} Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.609441 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerID="5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503" exitCode=0 Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.609517 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerDied","Data":"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503"} Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.613904 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cmmlx" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.614385 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cmmlx" event={"ID":"a83b6bd4-3813-465a-aa62-8bb029d2fcc0","Type":"ContainerDied","Data":"f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b"} Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.614488 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c080a0dcb1f594b8715810431d3503f72f1a88d14fa5a50df651d8b0daa02b" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.654045 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lrzjl-config-wm4wv"] Jan 28 07:07:02 crc kubenswrapper[4776]: E0128 07:07:02.654492 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83b6bd4-3813-465a-aa62-8bb029d2fcc0" containerName="swift-ring-rebalance" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.654515 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83b6bd4-3813-465a-aa62-8bb029d2fcc0" containerName="swift-ring-rebalance" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.654800 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83b6bd4-3813-465a-aa62-8bb029d2fcc0" containerName="swift-ring-rebalance" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.655520 
4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.658942 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.685720 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl-config-wm4wv"] Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.723169 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.723338 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.723831 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.723923 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn\") pod 
\"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.723979 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.724014 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvf8k\" (UniqueName: \"kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvf8k\" (UniqueName: 
\"kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825730 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.825912 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.826419 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.826619 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.827053 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.828098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:02 crc kubenswrapper[4776]: I0128 07:07:02.841247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvf8k\" (UniqueName: \"kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k\") pod \"ovn-controller-lrzjl-config-wm4wv\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:03 crc kubenswrapper[4776]: I0128 07:07:03.012105 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:03 crc kubenswrapper[4776]: I0128 07:07:03.852288 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:07:03 crc kubenswrapper[4776]: I0128 07:07:03.852342 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:07:07 crc kubenswrapper[4776]: I0128 07:07:07.317982 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lrzjl" podUID="4d2cb31b-ab97-4714-9978-225821819328" containerName="ovn-controller" probeResult="failure" output=< Jan 28 07:07:07 crc kubenswrapper[4776]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 28 07:07:07 crc kubenswrapper[4776]: > Jan 28 07:07:08 crc kubenswrapper[4776]: I0128 07:07:08.759892 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:08 crc kubenswrapper[4776]: I0128 07:07:08.763209 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.576314 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rm2d7" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.700131 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerStarted","Data":"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92"} Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.701330 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.706486 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rm2d7" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.706775 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm2d7" event={"ID":"b35a7426-2d99-402a-8de1-297352679817","Type":"ContainerDied","Data":"bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c"} Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.706798 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec744cf95fa78e4366d1201865192f80285aa8330565158b0791c8f00071b0c" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.708480 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.732043 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=57.627622092 podStartE2EDuration="1m8.732019416s" podCreationTimestamp="2026-01-28 07:06:01 +0000 UTC" firstStartedPulling="2026-01-28 07:06:15.863146831 +0000 UTC m=+947.278806991" lastFinishedPulling="2026-01-28 07:06:26.967544155 +0000 UTC m=+958.383204315" observedRunningTime="2026-01-28 07:07:09.726564388 +0000 UTC 
m=+1001.142224548" watchObservedRunningTime="2026-01-28 07:07:09.732019416 +0000 UTC m=+1001.147679586" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.746352 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts\") pod \"b35a7426-2d99-402a-8de1-297352679817\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.746408 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mghnw\" (UniqueName: \"kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw\") pod \"b35a7426-2d99-402a-8de1-297352679817\" (UID: \"b35a7426-2d99-402a-8de1-297352679817\") " Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.747293 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b35a7426-2d99-402a-8de1-297352679817" (UID: "b35a7426-2d99-402a-8de1-297352679817"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.752099 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw" (OuterVolumeSpecName: "kube-api-access-mghnw") pod "b35a7426-2d99-402a-8de1-297352679817" (UID: "b35a7426-2d99-402a-8de1-297352679817"). InnerVolumeSpecName "kube-api-access-mghnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.848342 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35a7426-2d99-402a-8de1-297352679817-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.848375 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mghnw\" (UniqueName: \"kubernetes.io/projected/b35a7426-2d99-402a-8de1-297352679817-kube-api-access-mghnw\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:09 crc kubenswrapper[4776]: I0128 07:07:09.965480 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl-config-wm4wv"] Jan 28 07:07:09 crc kubenswrapper[4776]: W0128 07:07:09.968517 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod368217e4_4227_4a27_a06f_6289ea05414a.slice/crio-3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d WatchSource:0}: Error finding container 3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d: Status 404 returned error can't find the container with id 3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.362860 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " pod="openstack/swift-storage-0" Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.369402 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/331ac509-cce0-4545-ac41-1224aae65295-etc-swift\") pod \"swift-storage-0\" (UID: \"331ac509-cce0-4545-ac41-1224aae65295\") " 
pod="openstack/swift-storage-0" Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.436337 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.718305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerStarted","Data":"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562"} Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.718734 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.719709 4776 generic.go:334] "Generic (PLEG): container finished" podID="368217e4-4227-4a27-a06f-6289ea05414a" containerID="b7b5ebe2d3e811df95cce47701109c338d599e4ddf62916a3520cd5be0d9bf38" exitCode=0 Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.719778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-wm4wv" event={"ID":"368217e4-4227-4a27-a06f-6289ea05414a","Type":"ContainerDied","Data":"b7b5ebe2d3e811df95cce47701109c338d599e4ddf62916a3520cd5be0d9bf38"} Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.719804 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-wm4wv" event={"ID":"368217e4-4227-4a27-a06f-6289ea05414a","Type":"ContainerStarted","Data":"3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d"} Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.720834 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s2gfx" event={"ID":"d8e5e104-629f-43f5-8372-dbe94e3938af","Type":"ContainerStarted","Data":"9407c576485ccdcd24787765b27e173c32a324fd9d1c97fec5310897541eb1f2"} Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.753786 4776 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=59.501382385 podStartE2EDuration="1m9.753743825s" podCreationTimestamp="2026-01-28 07:06:01 +0000 UTC" firstStartedPulling="2026-01-28 07:06:17.416081178 +0000 UTC m=+948.831741338" lastFinishedPulling="2026-01-28 07:06:27.668442618 +0000 UTC m=+959.084102778" observedRunningTime="2026-01-28 07:07:10.747188357 +0000 UTC m=+1002.162848547" watchObservedRunningTime="2026-01-28 07:07:10.753743825 +0000 UTC m=+1002.169404005" Jan 28 07:07:10 crc kubenswrapper[4776]: I0128 07:07:10.800524 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s2gfx" podStartSLOduration=3.277244709 podStartE2EDuration="15.800502917s" podCreationTimestamp="2026-01-28 07:06:55 +0000 UTC" firstStartedPulling="2026-01-28 07:06:56.959277081 +0000 UTC m=+988.374937241" lastFinishedPulling="2026-01-28 07:07:09.482535249 +0000 UTC m=+1000.898195449" observedRunningTime="2026-01-28 07:07:10.792750116 +0000 UTC m=+1002.208410286" watchObservedRunningTime="2026-01-28 07:07:10.800502917 +0000 UTC m=+1002.216163077" Jan 28 07:07:11 crc kubenswrapper[4776]: I0128 07:07:11.553794 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 07:07:11 crc kubenswrapper[4776]: I0128 07:07:11.733232 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"e424d8a94df643505bde0407474f422c7bd7b487fa2b47c562a07674fac84c2a"} Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.182863 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-wm4wv" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294150 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvf8k\" (UniqueName: \"kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294270 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294339 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294361 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run" (OuterVolumeSpecName: "var-run") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294381 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294416 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294525 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn\") pod \"368217e4-4227-4a27-a06f-6289ea05414a\" (UID: \"368217e4-4227-4a27-a06f-6289ea05414a\") " Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.294580 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.295237 4776 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.295256 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.295268 4776 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.295302 4776 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/368217e4-4227-4a27-a06f-6289ea05414a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.295779 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts" (OuterVolumeSpecName: "scripts") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.307355 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k" (OuterVolumeSpecName: "kube-api-access-mvf8k") pod "368217e4-4227-4a27-a06f-6289ea05414a" (UID: "368217e4-4227-4a27-a06f-6289ea05414a"). InnerVolumeSpecName "kube-api-access-mvf8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.329265 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lrzjl" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.397265 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvf8k\" (UniqueName: \"kubernetes.io/projected/368217e4-4227-4a27-a06f-6289ea05414a-kube-api-access-mvf8k\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.397309 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.397320 4776 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/368217e4-4227-4a27-a06f-6289ea05414a-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.529458 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.530146 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="prometheus" 
containerID="cri-o://88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876" gracePeriod=600 Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.530205 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="config-reloader" containerID="cri-o://00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967" gracePeriod=600 Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.530176 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="thanos-sidecar" containerID="cri-o://893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a" gracePeriod=600 Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.743393 4776 generic.go:334] "Generic (PLEG): container finished" podID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerID="893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a" exitCode=0 Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.743423 4776 generic.go:334] "Generic (PLEG): container finished" podID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerID="88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876" exitCode=0 Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.743467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerDied","Data":"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a"} Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.743508 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerDied","Data":"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876"} Jan 28 07:07:12 crc 
kubenswrapper[4776]: I0128 07:07:12.745372 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-wm4wv" event={"ID":"368217e4-4227-4a27-a06f-6289ea05414a","Type":"ContainerDied","Data":"3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d"}
Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.745397 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b20857a553e00ca09f832de089a70e6cb0ad41002d7310f8eb0b5fe1cf86a5d"
Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.745457 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-wm4wv"
Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.837286 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rm2d7"]
Jan 28 07:07:12 crc kubenswrapper[4776]: I0128 07:07:12.844143 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rm2d7"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.292609 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lrzjl-config-wm4wv"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.329005 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35a7426-2d99-402a-8de1-297352679817" path="/var/lib/kubelet/pods/b35a7426-2d99-402a-8de1-297352679817/volumes"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.329595 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lrzjl-config-wm4wv"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.341651 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lrzjl-config-6626j"]
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.342181 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35a7426-2d99-402a-8de1-297352679817" containerName="mariadb-account-create-update"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.342206 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a7426-2d99-402a-8de1-297352679817" containerName="mariadb-account-create-update"
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.342221 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368217e4-4227-4a27-a06f-6289ea05414a" containerName="ovn-config"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.342231 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="368217e4-4227-4a27-a06f-6289ea05414a" containerName="ovn-config"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.342413 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="368217e4-4227-4a27-a06f-6289ea05414a" containerName="ovn-config"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.342447 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35a7426-2d99-402a-8de1-297352679817" containerName="mariadb-account-create-update"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.343181 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.344963 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.355713 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl-config-6626j"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412276 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9dg\" (UniqueName: \"kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412380 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.412524 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514126 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9dg\" (UniqueName: \"kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514654 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.514690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.515126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.515127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.515793 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.516604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.528315 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.537379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9dg\" (UniqueName: \"kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg\") pod \"ovn-controller-lrzjl-config-6626j\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615581 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615645 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615796 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615821 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.615948 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616067 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4246\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config\") pod \"544a6c48-5eb5-42f0-a46a-0a726d213341\" (UID: \"544a6c48-5eb5-42f0-a46a-0a726d213341\") "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616436 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616688 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.616708 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.617162 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.619594 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config" (OuterVolumeSpecName: "config") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.619718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.619718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out" (OuterVolumeSpecName: "config-out") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.625714 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.625822 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246" (OuterVolumeSpecName: "kube-api-access-f4246") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "kube-api-access-f4246". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.649803 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config" (OuterVolumeSpecName: "web-config") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.668528 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "544a6c48-5eb5-42f0-a46a-0a726d213341" (UID: "544a6c48-5eb5-42f0-a46a-0a726d213341"). InnerVolumeSpecName "pvc-28337057-3ad3-471e-9736-ebdaa343fbf9". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.669011 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-6626j"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718657 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/544a6c48-5eb5-42f0-a46a-0a726d213341-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718699 4776 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718716 4776 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-tls-assets\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718731 4776 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/544a6c48-5eb5-42f0-a46a-0a726d213341-config-out\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718780 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") on node \"crc\" "
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718797 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4246\" (UniqueName: \"kubernetes.io/projected/544a6c48-5eb5-42f0-a46a-0a726d213341-kube-api-access-f4246\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718812 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-config\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.718822 4776 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/544a6c48-5eb5-42f0-a46a-0a726d213341-web-config\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.741027 4776 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.741217 4776 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-28337057-3ad3-471e-9736-ebdaa343fbf9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9") on node "crc"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.769834 4776 generic.go:334] "Generic (PLEG): container finished" podID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerID="00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967" exitCode=0
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.769899 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerDied","Data":"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967"}
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.769903 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.769943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"544a6c48-5eb5-42f0-a46a-0a726d213341","Type":"ContainerDied","Data":"b03badf491eb8fef69f99c8b719a8486d3e96012a7607158885cf521b5b4a741"}
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.769963 4776 scope.go:117] "RemoveContainer" containerID="893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.804524 4776 scope.go:117] "RemoveContainer" containerID="00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.813535 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.820158 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.821033 4776 reconciler_common.go:293] "Volume detached for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") on node \"crc\" DevicePath \"\""
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.841438 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.842008 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="config-reloader"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842023 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="config-reloader"
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.842039 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="init-config-reloader"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842051 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="init-config-reloader"
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.842063 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="thanos-sidecar"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842069 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="thanos-sidecar"
Jan 28 07:07:13 crc kubenswrapper[4776]: E0128 07:07:13.842085 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="prometheus"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842090 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="prometheus"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842242 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="thanos-sidecar"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842257 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="prometheus"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.842269 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" containerName="config-reloader"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.843565 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846373 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846443 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846459 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846567 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846673 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846888 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.846959 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.847127 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lmp7n"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.852148 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.869986 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922539 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922694 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922737 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922765 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922796 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922901 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922923 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9t4z\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.922977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.923003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.923026 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:13 crc kubenswrapper[4776]: I0128 07:07:13.923052 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024457 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9t4z\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024502 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024540 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024582 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024639 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024683 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.024706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.025691
4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.026903 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.026991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.031325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.033021 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 
07:07:14.033733 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.034091 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.034135 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2e9c055391cfae11cfbe6abdf3b945738020df5dfe58cd3a482199e94820340b/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.034279 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.034289 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.039508 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.039609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.039894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.042517 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9t4z\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z\") pod \"prometheus-metric-storage-0\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.069603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: 
\"da477959-63db-4b5e-aef0-ca65915e6c3a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.093212 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lpldc"] Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.094898 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.098429 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.114312 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lpldc"] Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.125652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcxkw\" (UniqueName: \"kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.125758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.160442 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.228000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.228101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcxkw\" (UniqueName: \"kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.229514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.258838 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcxkw\" (UniqueName: \"kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw\") pod \"root-account-create-update-lpldc\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.311026 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lrzjl-config-6626j"] Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.422886 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.495603 4776 scope.go:117] "RemoveContainer" containerID="88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.590045 4776 scope.go:117] "RemoveContainer" containerID="35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.646967 4776 scope.go:117] "RemoveContainer" containerID="893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a" Jan 28 07:07:14 crc kubenswrapper[4776]: E0128 07:07:14.647602 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a\": container with ID starting with 893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a not found: ID does not exist" containerID="893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.647646 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a"} err="failed to get container status \"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a\": rpc error: code = NotFound desc = could not find container \"893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a\": container with ID starting with 893f47538cfb3d831779b83f9531245334da025f1f314bfe2726f8b5b6afbb8a not found: ID does not exist" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.647679 4776 scope.go:117] "RemoveContainer" containerID="00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967" Jan 28 07:07:14 crc kubenswrapper[4776]: E0128 07:07:14.647991 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967\": container with ID starting with 00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967 not found: ID does not exist" containerID="00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.648022 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967"} err="failed to get container status \"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967\": rpc error: code = NotFound desc = could not find container \"00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967\": container with ID starting with 00e723df3061d416830cea707b8efa99babb8de270569f1ff9c43c05bfba0967 not found: ID does not exist" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.648047 4776 scope.go:117] "RemoveContainer" containerID="88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876" Jan 28 07:07:14 crc kubenswrapper[4776]: E0128 07:07:14.648825 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876\": container with ID starting with 88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876 not found: ID does not exist" containerID="88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.648854 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876"} err="failed to get container status \"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876\": rpc error: code = NotFound desc = could not find container 
\"88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876\": container with ID starting with 88bc359bfc0f27f92955825e4558579ce8d0019689935da73440c43347421876 not found: ID does not exist" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.648872 4776 scope.go:117] "RemoveContainer" containerID="35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29" Jan 28 07:07:14 crc kubenswrapper[4776]: E0128 07:07:14.649109 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29\": container with ID starting with 35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29 not found: ID does not exist" containerID="35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.649146 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29"} err="failed to get container status \"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29\": rpc error: code = NotFound desc = could not find container \"35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29\": container with ID starting with 35c90678e12bea8f8da79ced725be71f64fcdb713c8b62dacebabb5f70075b29 not found: ID does not exist" Jan 28 07:07:14 crc kubenswrapper[4776]: I0128 07:07:14.787444 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-6626j" event={"ID":"c2d61b61-941b-4635-92fb-2425af02183b","Type":"ContainerStarted","Data":"a22503e08f21fa564cf97d3aba8c1e1c5a66fc76abd45752609f67eaa304f36a"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.034031 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lpldc"] Jan 28 07:07:15 crc kubenswrapper[4776]: W0128 07:07:15.038119 4776 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e0efa46_d84b_4268_9162_1a8363ed4eed.slice/crio-125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d WatchSource:0}: Error finding container 125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d: Status 404 returned error can't find the container with id 125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.108519 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:07:15 crc kubenswrapper[4776]: W0128 07:07:15.115819 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda477959_63db_4b5e_aef0_ca65915e6c3a.slice/crio-30b3ba66e268e8279a8c8d06018dba5266fcb1cc572cef6631f11b05c219dd62 WatchSource:0}: Error finding container 30b3ba66e268e8279a8c8d06018dba5266fcb1cc572cef6631f11b05c219dd62: Status 404 returned error can't find the container with id 30b3ba66e268e8279a8c8d06018dba5266fcb1cc572cef6631f11b05c219dd62 Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.319876 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368217e4-4227-4a27-a06f-6289ea05414a" path="/var/lib/kubelet/pods/368217e4-4227-4a27-a06f-6289ea05414a/volumes" Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.320979 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544a6c48-5eb5-42f0-a46a-0a726d213341" path="/var/lib/kubelet/pods/544a6c48-5eb5-42f0-a46a-0a726d213341/volumes" Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.800913 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"7b6555e074cf63d60abbc292bc64e699cbb85a8266b533513f8ba0b0d302858c"} Jan 28 07:07:15 crc 
kubenswrapper[4776]: I0128 07:07:15.801342 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"e448510ba4a954803c02c6d6de9180ea2d5098983ea2b83c3f01349a8dbde6a9"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.801406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"52ab9de8310e25c281d237610fe17bdcb87f1492ddc90ca178984a4821a353be"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.801434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"e83ec6c150afb67d5afc3475a93cd75e64167929c42983a71866ffc545815449"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.803674 4776 generic.go:334] "Generic (PLEG): container finished" podID="c2d61b61-941b-4635-92fb-2425af02183b" containerID="08a95b6bd6ee8af42cfb318de3d9ecce10ec1906ab11a18056ad3c82c763e397" exitCode=0 Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.804272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-6626j" event={"ID":"c2d61b61-941b-4635-92fb-2425af02183b","Type":"ContainerDied","Data":"08a95b6bd6ee8af42cfb318de3d9ecce10ec1906ab11a18056ad3c82c763e397"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.806866 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e0efa46-d84b-4268-9162-1a8363ed4eed" containerID="a0d7b5f7851761c9bbfecc56e844c0e290ef65346011ef326ecf8dae468a6995" exitCode=0 Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.806964 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lpldc" 
event={"ID":"1e0efa46-d84b-4268-9162-1a8363ed4eed","Type":"ContainerDied","Data":"a0d7b5f7851761c9bbfecc56e844c0e290ef65346011ef326ecf8dae468a6995"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.806991 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lpldc" event={"ID":"1e0efa46-d84b-4268-9162-1a8363ed4eed","Type":"ContainerStarted","Data":"125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d"} Jan 28 07:07:15 crc kubenswrapper[4776]: I0128 07:07:15.808598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerStarted","Data":"30b3ba66e268e8279a8c8d06018dba5266fcb1cc572cef6631f11b05c219dd62"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.445748 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-6626j" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.451286 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505115 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcxkw\" (UniqueName: \"kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw\") pod \"1e0efa46-d84b-4268-9162-1a8363ed4eed\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505230 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505305 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505354 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505386 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505448 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8v9dg\" (UniqueName: \"kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505583 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn\") pod \"c2d61b61-941b-4635-92fb-2425af02183b\" (UID: \"c2d61b61-941b-4635-92fb-2425af02183b\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.505673 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts\") pod \"1e0efa46-d84b-4268-9162-1a8363ed4eed\" (UID: \"1e0efa46-d84b-4268-9162-1a8363ed4eed\") " Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.506715 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.506811 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e0efa46-d84b-4268-9162-1a8363ed4eed" (UID: "1e0efa46-d84b-4268-9162-1a8363ed4eed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.507516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.508884 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.508949 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run" (OuterVolumeSpecName: "var-run") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.509018 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts" (OuterVolumeSpecName: "scripts") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.512217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg" (OuterVolumeSpecName: "kube-api-access-8v9dg") pod "c2d61b61-941b-4635-92fb-2425af02183b" (UID: "c2d61b61-941b-4635-92fb-2425af02183b"). InnerVolumeSpecName "kube-api-access-8v9dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.513748 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw" (OuterVolumeSpecName: "kube-api-access-xcxkw") pod "1e0efa46-d84b-4268-9162-1a8363ed4eed" (UID: "1e0efa46-d84b-4268-9162-1a8363ed4eed"). InnerVolumeSpecName "kube-api-access-xcxkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607528 4776 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607575 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607585 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9dg\" (UniqueName: \"kubernetes.io/projected/c2d61b61-941b-4635-92fb-2425af02183b-kube-api-access-8v9dg\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607594 4776 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607603 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0efa46-d84b-4268-9162-1a8363ed4eed-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607612 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcxkw\" (UniqueName: \"kubernetes.io/projected/1e0efa46-d84b-4268-9162-1a8363ed4eed-kube-api-access-xcxkw\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607620 4776 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2d61b61-941b-4635-92fb-2425af02183b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.607630 4776 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d61b61-941b-4635-92fb-2425af02183b-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.849152 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"6b620f49f163ce7347268d41d575f10f10ac4c9dc325c051041dd4c313fdda7e"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.849507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"e8647c7b7681fda644a64a559b69177370138b253cd26c69f462a4034d731829"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.851818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lrzjl-config-6626j" 
event={"ID":"c2d61b61-941b-4635-92fb-2425af02183b","Type":"ContainerDied","Data":"a22503e08f21fa564cf97d3aba8c1e1c5a66fc76abd45752609f67eaa304f36a"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.851849 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22503e08f21fa564cf97d3aba8c1e1c5a66fc76abd45752609f67eaa304f36a" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.851876 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lrzjl-config-6626j" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.861305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lpldc" event={"ID":"1e0efa46-d84b-4268-9162-1a8363ed4eed","Type":"ContainerDied","Data":"125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.861343 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125c0c06d2c9ad08c668d60073c8f08e0afc4b813c437faedae783dea737b03d" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.861420 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lpldc" Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.878450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerStarted","Data":"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00"} Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.894978 4776 generic.go:334] "Generic (PLEG): container finished" podID="d8e5e104-629f-43f5-8372-dbe94e3938af" containerID="9407c576485ccdcd24787765b27e173c32a324fd9d1c97fec5310897541eb1f2" exitCode=0 Jan 28 07:07:17 crc kubenswrapper[4776]: I0128 07:07:17.895025 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s2gfx" event={"ID":"d8e5e104-629f-43f5-8372-dbe94e3938af","Type":"ContainerDied","Data":"9407c576485ccdcd24787765b27e173c32a324fd9d1c97fec5310897541eb1f2"} Jan 28 07:07:18 crc kubenswrapper[4776]: I0128 07:07:18.543329 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lrzjl-config-6626j"] Jan 28 07:07:18 crc kubenswrapper[4776]: I0128 07:07:18.551423 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lrzjl-config-6626j"] Jan 28 07:07:18 crc kubenswrapper[4776]: I0128 07:07:18.914670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"003473e8ee3409332d9ecb3de0c3ab4b0afa58ebd8adba02f159f0a5a384544c"} Jan 28 07:07:18 crc kubenswrapper[4776]: I0128 07:07:18.914709 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"5b98b616157df76140ba8efbb8fb24283817047e3a58bbbd3e3f43a62beb9ef3"} Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.314258 4776 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="c2d61b61-941b-4635-92fb-2425af02183b" path="/var/lib/kubelet/pods/c2d61b61-941b-4635-92fb-2425af02183b/volumes" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.469534 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s2gfx" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.537657 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle\") pod \"d8e5e104-629f-43f5-8372-dbe94e3938af\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.537917 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xgmd\" (UniqueName: \"kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd\") pod \"d8e5e104-629f-43f5-8372-dbe94e3938af\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.537992 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data\") pod \"d8e5e104-629f-43f5-8372-dbe94e3938af\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.538435 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data\") pod \"d8e5e104-629f-43f5-8372-dbe94e3938af\" (UID: \"d8e5e104-629f-43f5-8372-dbe94e3938af\") " Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.556768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd" 
(OuterVolumeSpecName: "kube-api-access-8xgmd") pod "d8e5e104-629f-43f5-8372-dbe94e3938af" (UID: "d8e5e104-629f-43f5-8372-dbe94e3938af"). InnerVolumeSpecName "kube-api-access-8xgmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.559670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d8e5e104-629f-43f5-8372-dbe94e3938af" (UID: "d8e5e104-629f-43f5-8372-dbe94e3938af"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.586392 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data" (OuterVolumeSpecName: "config-data") pod "d8e5e104-629f-43f5-8372-dbe94e3938af" (UID: "d8e5e104-629f-43f5-8372-dbe94e3938af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.590145 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8e5e104-629f-43f5-8372-dbe94e3938af" (UID: "d8e5e104-629f-43f5-8372-dbe94e3938af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.641302 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xgmd\" (UniqueName: \"kubernetes.io/projected/d8e5e104-629f-43f5-8372-dbe94e3938af-kube-api-access-8xgmd\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.641346 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.641362 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.641374 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e5e104-629f-43f5-8372-dbe94e3938af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.932016 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"00ea7ad0f45ce6d1043d1c9a3a105f4c266431d03cfa3925d380f448b190270b"} Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.932060 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"05243ed791e2799145183bfc9c4d4d6946eef9083a416e1cbbd5279917215400"} Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.932070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"048bb2fbbf32076f160c127e0626ef68771b2abb380450904dbf1b52591fd7d9"} Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.935264 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s2gfx" event={"ID":"d8e5e104-629f-43f5-8372-dbe94e3938af","Type":"ContainerDied","Data":"66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560"} Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.935290 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66bf839771bf4a9776f8cc7ff1a24c09f93cd4751e3e738c4b0636dbc657f560" Jan 28 07:07:19 crc kubenswrapper[4776]: I0128 07:07:19.935349 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s2gfx" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.350939 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:20 crc kubenswrapper[4776]: E0128 07:07:20.351522 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d61b61-941b-4635-92fb-2425af02183b" containerName="ovn-config" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.351537 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d61b61-941b-4635-92fb-2425af02183b" containerName="ovn-config" Jan 28 07:07:20 crc kubenswrapper[4776]: E0128 07:07:20.364743 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0efa46-d84b-4268-9162-1a8363ed4eed" containerName="mariadb-account-create-update" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.364774 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0efa46-d84b-4268-9162-1a8363ed4eed" containerName="mariadb-account-create-update" Jan 28 07:07:20 crc kubenswrapper[4776]: E0128 07:07:20.364804 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8e5e104-629f-43f5-8372-dbe94e3938af" containerName="glance-db-sync" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.364811 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e5e104-629f-43f5-8372-dbe94e3938af" containerName="glance-db-sync" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.365071 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0efa46-d84b-4268-9162-1a8363ed4eed" containerName="mariadb-account-create-update" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.365094 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e5e104-629f-43f5-8372-dbe94e3938af" containerName="glance-db-sync" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.365107 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d61b61-941b-4635-92fb-2425af02183b" containerName="ovn-config" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.365917 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.365996 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.452567 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.452636 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.452686 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.452714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7z7l\" (UniqueName: \"kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.452743 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" 
(UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.554607 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.554694 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.554767 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.554809 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7z7l\" (UniqueName: \"kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.554839 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.555827 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.556359 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.556355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.556894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.588342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7z7l\" (UniqueName: \"kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l\") pod \"dnsmasq-dns-5b946c75cc-fvpc7\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.749080 
4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.961646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"bd01dfbcbaa1f2948b5fd3630654987459db735d40ea145f72fbd96e120fa25f"} Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.961692 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"ba6e7db081a57124ba4492075d3d349d2339064481e13dc7d79a5a26e0bf82d7"} Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.961707 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"091748758a01359b684d6e12c5331b8a09d42c9f99b5d86bc35aec0ea711c6ef"} Jan 28 07:07:20 crc kubenswrapper[4776]: I0128 07:07:20.961717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"331ac509-cce0-4545-ac41-1224aae65295","Type":"ContainerStarted","Data":"eabf52d05aa5619cd74c4f746ead44ccd5e6fc9f99d74806fd9c7650f1482d7c"} Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:20.998806 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.235885926 podStartE2EDuration="43.998785779s" podCreationTimestamp="2026-01-28 07:06:37 +0000 UTC" firstStartedPulling="2026-01-28 07:07:11.574467964 +0000 UTC m=+1002.990128114" lastFinishedPulling="2026-01-28 07:07:19.337367797 +0000 UTC m=+1010.753027967" observedRunningTime="2026-01-28 07:07:20.994459522 +0000 UTC m=+1012.410119712" watchObservedRunningTime="2026-01-28 07:07:20.998785779 +0000 UTC m=+1012.414445939" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 
07:07:21.187110 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:21 crc kubenswrapper[4776]: W0128 07:07:21.188020 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b1e1f65_5eae_4b82_b9b0_85e357a45188.slice/crio-a92114272b95d3e413a6b6d30bb1826fe13397f833691a49892da1f76c55915b WatchSource:0}: Error finding container a92114272b95d3e413a6b6d30bb1826fe13397f833691a49892da1f76c55915b: Status 404 returned error can't find the container with id a92114272b95d3e413a6b6d30bb1826fe13397f833691a49892da1f76c55915b Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.387327 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.429469 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"] Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.430827 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.432230 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.443156 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"] Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576320 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576390 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576471 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.576532 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69j9m\" (UniqueName: \"kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.678109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.678161 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.678218 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.678633 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.679043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.679218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.679277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69j9m\" (UniqueName: \"kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.679392 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: 
I0128 07:07:21.679388 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.679866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.680152 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.695105 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69j9m\" (UniqueName: \"kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m\") pod \"dnsmasq-dns-74f6bcbc87-hf97r\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.769348 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.985090 4776 generic.go:334] "Generic (PLEG): container finished" podID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerID="b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631" exitCode=0 Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.985316 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" event={"ID":"5b1e1f65-5eae-4b82-b9b0-85e357a45188","Type":"ContainerDied","Data":"b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631"} Jan 28 07:07:21 crc kubenswrapper[4776]: I0128 07:07:21.985351 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" event={"ID":"5b1e1f65-5eae-4b82-b9b0-85e357a45188","Type":"ContainerStarted","Data":"a92114272b95d3e413a6b6d30bb1826fe13397f833691a49892da1f76c55915b"} Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.234817 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"] Jan 28 07:07:22 crc kubenswrapper[4776]: W0128 07:07:22.236073 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb1a460_72c7_4fc9_9a41_f92d30d63444.slice/crio-2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9 WatchSource:0}: Error finding container 2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9: Status 404 returned error can't find the container with id 2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9 Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.541754 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.792793 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.850894 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-kqfn7"] Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.851952 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.855229 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.856079 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-hcstr" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.876989 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-kqfn7"] Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.918003 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nskdh"] Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.919074 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.936248 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nskdh"] Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.994454 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" event={"ID":"5b1e1f65-5eae-4b82-b9b0-85e357a45188","Type":"ContainerStarted","Data":"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2"} Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.994585 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="dnsmasq-dns" containerID="cri-o://1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2" gracePeriod=10 Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.994811 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.999091 4776 generic.go:334] "Generic (PLEG): container finished" podID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerID="81684da9b9cefbdb2d3933b15b710c4e87d68943bb5b190940da8cfe42b7cdd4" exitCode=0 Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.999141 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" event={"ID":"4cb1a460-72c7-4fc9-9a41-f92d30d63444","Type":"ContainerDied","Data":"81684da9b9cefbdb2d3933b15b710c4e87d68943bb5b190940da8cfe42b7cdd4"} Jan 28 07:07:22 crc kubenswrapper[4776]: I0128 07:07:22.999167 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" event={"ID":"4cb1a460-72c7-4fc9-9a41-f92d30d63444","Type":"ContainerStarted","Data":"2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9"} Jan 28 07:07:23 crc kubenswrapper[4776]: 
I0128 07:07:23.013320 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xz29\" (UniqueName: \"kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.013400 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.013457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcgk\" (UniqueName: \"kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.013568 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.013744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc 
kubenswrapper[4776]: I0128 07:07:23.013883 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.019635 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lpldc"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.034487 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lpldc"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.034783 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" podStartSLOduration=3.034766502 podStartE2EDuration="3.034766502s" podCreationTimestamp="2026-01-28 07:07:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:23.023088754 +0000 UTC m=+1014.438748914" watchObservedRunningTime="2026-01-28 07:07:23.034766502 +0000 UTC m=+1014.450426662" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.112855 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d143-account-create-update-2s4kf"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.114123 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.115490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.117950 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.123492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.125586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.125690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xz29\" (UniqueName: \"kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.125784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.125813 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcgk\" (UniqueName: \"kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.125866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.127421 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.133092 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.134079 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data\") pod \"watcher-db-sync-kqfn7\" (UID: 
\"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.142617 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d143-account-create-update-2s4kf"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.163015 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xz29\" (UniqueName: \"kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29\") pod \"watcher-db-sync-kqfn7\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.168908 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcgk\" (UniqueName: \"kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk\") pod \"cinder-db-create-nskdh\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.181365 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.189520 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bmgbw"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.194328 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.221496 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bmgbw"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.228536 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.228592 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnpb\" (UniqueName: \"kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.235657 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1464-account-create-update-6k48w"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.235909 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.236914 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.246845 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.250611 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1464-account-create-update-6k48w"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.304381 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lm5q5"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.305675 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.312199 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.312498 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.312694 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.312733 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52vd7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.331176 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7qd\" (UniqueName: \"kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.339877 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.339941 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.340027 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzq8\" (UniqueName: \"kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.340094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.340121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnpb\" (UniqueName: \"kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 
07:07:23.341001 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.345269 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0efa46-d84b-4268-9162-1a8363ed4eed" path="/var/lib/kubelet/pods/1e0efa46-d84b-4268-9162-1a8363ed4eed/volumes" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.345748 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6ljzs"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.353353 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lm5q5"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.353481 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.388324 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ljzs"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.409815 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnpb\" (UniqueName: \"kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb\") pod \"neutron-d143-account-create-update-2s4kf\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.410889 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bfdc-account-create-update-bzmmp"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.412383 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.415185 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.440666 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bfdc-account-create-update-bzmmp"] Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441739 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zzl\" (UniqueName: \"kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7qd\" (UniqueName: \"kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " 
pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441915 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.441989 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvdq\" (UniqueName: \"kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq\") pod \"neutron-db-create-6ljzs\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.442015 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzq8\" (UniqueName: \"kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.442037 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts\") pod \"neutron-db-create-6ljzs\" (UID: 
\"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.443185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.443760 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.472361 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzq8\" (UniqueName: \"kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8\") pod \"barbican-db-create-bmgbw\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.488712 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7qd\" (UniqueName: \"kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd\") pod \"cinder-1464-account-create-update-6k48w\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547487 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts\") pod 
\"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547556 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvdq\" (UniqueName: \"kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq\") pod \"neutron-db-create-6ljzs\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547607 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts\") pod \"neutron-db-create-6ljzs\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547664 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zzl\" (UniqueName: \"kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547703 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data\") pod \"keystone-db-sync-lm5q5\" (UID: 
\"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.547773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsql\" (UniqueName: \"kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql\") pod \"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.552299 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.552321 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts\") pod \"neutron-db-create-6ljzs\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.553304 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.563270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.568180 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvdq\" 
(UniqueName: \"kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq\") pod \"neutron-db-create-6ljzs\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.580090 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zzl\" (UniqueName: \"kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl\") pod \"keystone-db-sync-lm5q5\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.603218 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.607905 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.650780 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsql\" (UniqueName: \"kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql\") pod \"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.650922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts\") pod \"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.651756 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts\") pod \"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.667651 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.668732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzsql\" (UniqueName: \"kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql\") pod \"barbican-bfdc-account-create-update-bzmmp\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.688905 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.753315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb\") pod \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.753713 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7z7l\" (UniqueName: \"kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l\") pod \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.753846 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc\") pod \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.753924 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config\") pod \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.754006 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb\") pod \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\" (UID: \"5b1e1f65-5eae-4b82-b9b0-85e357a45188\") " Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.764490 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.773112 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l" (OuterVolumeSpecName: "kube-api-access-z7z7l") pod "5b1e1f65-5eae-4b82-b9b0-85e357a45188" (UID: "5b1e1f65-5eae-4b82-b9b0-85e357a45188"). InnerVolumeSpecName "kube-api-access-z7z7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.843157 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config" (OuterVolumeSpecName: "config") pod "5b1e1f65-5eae-4b82-b9b0-85e357a45188" (UID: "5b1e1f65-5eae-4b82-b9b0-85e357a45188"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.845284 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b1e1f65-5eae-4b82-b9b0-85e357a45188" (UID: "5b1e1f65-5eae-4b82-b9b0-85e357a45188"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.854107 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b1e1f65-5eae-4b82-b9b0-85e357a45188" (UID: "5b1e1f65-5eae-4b82-b9b0-85e357a45188"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.855219 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b1e1f65-5eae-4b82-b9b0-85e357a45188" (UID: "5b1e1f65-5eae-4b82-b9b0-85e357a45188"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.856402 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7z7l\" (UniqueName: \"kubernetes.io/projected/5b1e1f65-5eae-4b82-b9b0-85e357a45188-kube-api-access-z7z7l\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.856425 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.856434 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.856444 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.856453 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b1e1f65-5eae-4b82-b9b0-85e357a45188-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:23 crc kubenswrapper[4776]: I0128 07:07:23.915071 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.020198 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" event={"ID":"4cb1a460-72c7-4fc9-9a41-f92d30d63444","Type":"ContainerStarted","Data":"f6e78ddf1d896fb0f6b71d14e9adc6d134ce7ec5db4388a24ae61e954d2df014"} Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.023475 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.034002 4776 generic.go:334] "Generic (PLEG): container finished" podID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerID="1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2" exitCode=0 Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.034051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" event={"ID":"5b1e1f65-5eae-4b82-b9b0-85e357a45188","Type":"ContainerDied","Data":"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2"} Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.034076 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" event={"ID":"5b1e1f65-5eae-4b82-b9b0-85e357a45188","Type":"ContainerDied","Data":"a92114272b95d3e413a6b6d30bb1826fe13397f833691a49892da1f76c55915b"} Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.034091 4776 scope.go:117] "RemoveContainer" containerID="1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.034209 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-fvpc7" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.077873 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" podStartSLOduration=3.077855141 podStartE2EDuration="3.077855141s" podCreationTimestamp="2026-01-28 07:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:24.07633137 +0000 UTC m=+1015.491991530" watchObservedRunningTime="2026-01-28 07:07:24.077855141 +0000 UTC m=+1015.493515301" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.095009 4776 scope.go:117] "RemoveContainer" containerID="b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.116771 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.128974 4776 scope.go:117] "RemoveContainer" containerID="1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.131428 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-fvpc7"] Jan 28 07:07:24 crc kubenswrapper[4776]: E0128 07:07:24.133438 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2\": container with ID starting with 1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2 not found: ID does not exist" containerID="1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.133489 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2"} err="failed to get container status \"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2\": rpc error: code = NotFound desc = could not find container \"1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2\": container with ID starting with 1d9a33e3bff5df2ff2f2919d036ef0535971314643a012d155444deaacdbf9e2 not found: ID does not exist" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.133520 4776 scope.go:117] "RemoveContainer" containerID="b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631" Jan 28 07:07:24 crc kubenswrapper[4776]: E0128 07:07:24.135945 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631\": container with ID starting with b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631 not found: ID does not exist" containerID="b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.135971 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631"} err="failed to get container status \"b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631\": rpc error: code = NotFound desc = could not find container \"b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631\": container with ID starting with b6447a100b124c82d85a4580ba855beb85390c14a7027c41c746bbc5cf136631 not found: ID does not exist" Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.237106 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nskdh"] Jan 28 07:07:24 crc kubenswrapper[4776]: W0128 07:07:24.243033 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff88b930_92ae_409c_9365_c9a0131558cb.slice/crio-b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55 WatchSource:0}: Error finding container b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55: Status 404 returned error can't find the container with id b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55 Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.251892 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-kqfn7"] Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.333183 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ljzs"] Jan 28 07:07:24 crc kubenswrapper[4776]: W0128 07:07:24.339288 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0d5e297_fedb_45dc_a861_423f2cbe5700.slice/crio-442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4 WatchSource:0}: Error finding container 442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4: Status 404 returned error can't find the container with id 442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4 Jan 28 07:07:24 crc kubenswrapper[4776]: W0128 07:07:24.343696 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde153a8c_d3be_4575_bc85_4d4b08bdf05c.slice/crio-cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f WatchSource:0}: Error finding container cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f: Status 404 returned error can't find the container with id cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.348365 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d143-account-create-update-2s4kf"] Jan 28 
07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.534633 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1464-account-create-update-6k48w"] Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.553280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lm5q5"] Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.602844 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bmgbw"] Jan 28 07:07:24 crc kubenswrapper[4776]: W0128 07:07:24.655230 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03934a8f_5036_4cf4_9dea_8f1bde73b2d7.slice/crio-eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c WatchSource:0}: Error finding container eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c: Status 404 returned error can't find the container with id eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c Jan 28 07:07:24 crc kubenswrapper[4776]: I0128 07:07:24.699501 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bfdc-account-create-update-bzmmp"] Jan 28 07:07:24 crc kubenswrapper[4776]: W0128 07:07:24.753851 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2c47ae6_fdae_4e7c_b4d3_a8faf292b443.slice/crio-98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f WatchSource:0}: Error finding container 98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f: Status 404 returned error can't find the container with id 98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.060968 4776 generic.go:334] "Generic (PLEG): container finished" podID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerID="1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00" 
exitCode=0 Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.061072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerDied","Data":"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.069829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bfdc-account-create-update-bzmmp" event={"ID":"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443","Type":"ContainerStarted","Data":"98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.073234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nskdh" event={"ID":"ff88b930-92ae-409c-9365-c9a0131558cb","Type":"ContainerStarted","Data":"428a6bfbb182fcaa6f22bf059573f8b37e01a695e531100ac270f0987a79d309"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.073283 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nskdh" event={"ID":"ff88b930-92ae-409c-9365-c9a0131558cb","Type":"ContainerStarted","Data":"b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.075463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm5q5" event={"ID":"03934a8f-5036-4cf4-9dea-8f1bde73b2d7","Type":"ContainerStarted","Data":"eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.099864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d143-account-create-update-2s4kf" event={"ID":"e0d5e297-fedb-45dc-a861-423f2cbe5700","Type":"ContainerStarted","Data":"abb8aa8983884cf9ae6e6f7d8cd64e47f7f5f5e7d22d3522ad1d1070717ca077"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.099905 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-d143-account-create-update-2s4kf" event={"ID":"e0d5e297-fedb-45dc-a861-423f2cbe5700","Type":"ContainerStarted","Data":"442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.102319 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-kqfn7" event={"ID":"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e","Type":"ContainerStarted","Data":"125a1cd0e1f5d11cf2d1654770399434723ddb4ac31b391bc6f08c080d3966c4"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.126674 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ljzs" event={"ID":"de153a8c-d3be-4575-bc85-4d4b08bdf05c","Type":"ContainerStarted","Data":"79080a7b53a9f2a322741aa184fbd559b9a54427bb29bd067346407e0c890255"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.126727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ljzs" event={"ID":"de153a8c-d3be-4575-bc85-4d4b08bdf05c","Type":"ContainerStarted","Data":"cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.130593 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bmgbw" event={"ID":"54d514d3-c80d-400a-adaa-2b7adf96aab8","Type":"ContainerStarted","Data":"e7421ebad821a5198f3af782efa3442696402f0250ea702de80f398114573c59"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.144727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1464-account-create-update-6k48w" event={"ID":"19c08552-4143-40ef-bb55-76c8d5146a7c","Type":"ContainerStarted","Data":"34199f85b5df395c55405b41e3e0ee84b1e4189317638f251886597895a61300"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.144758 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1464-account-create-update-6k48w" 
event={"ID":"19c08552-4143-40ef-bb55-76c8d5146a7c","Type":"ContainerStarted","Data":"50423020c6ca9395ca6891fa62936104f81c6ff9de6c3b9dea0cd7fa810f9522"} Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.164127 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d143-account-create-update-2s4kf" podStartSLOduration=2.164106994 podStartE2EDuration="2.164106994s" podCreationTimestamp="2026-01-28 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:25.122476651 +0000 UTC m=+1016.538136811" watchObservedRunningTime="2026-01-28 07:07:25.164106994 +0000 UTC m=+1016.579767154" Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.172607 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-6ljzs" podStartSLOduration=2.172590265 podStartE2EDuration="2.172590265s" podCreationTimestamp="2026-01-28 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:25.140854342 +0000 UTC m=+1016.556514502" watchObservedRunningTime="2026-01-28 07:07:25.172590265 +0000 UTC m=+1016.588250425" Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.187992 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-bmgbw" podStartSLOduration=2.187973934 podStartE2EDuration="2.187973934s" podCreationTimestamp="2026-01-28 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:25.158758479 +0000 UTC m=+1016.574418639" watchObservedRunningTime="2026-01-28 07:07:25.187973934 +0000 UTC m=+1016.603634094" Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.201525 4776 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-1464-account-create-update-6k48w" podStartSLOduration=2.201509682 podStartE2EDuration="2.201509682s" podCreationTimestamp="2026-01-28 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:25.174111367 +0000 UTC m=+1016.589771527" watchObservedRunningTime="2026-01-28 07:07:25.201509682 +0000 UTC m=+1016.617169842" Jan 28 07:07:25 crc kubenswrapper[4776]: I0128 07:07:25.323126 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" path="/var/lib/kubelet/pods/5b1e1f65-5eae-4b82-b9b0-85e357a45188/volumes" Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.152302 4776 generic.go:334] "Generic (PLEG): container finished" podID="19c08552-4143-40ef-bb55-76c8d5146a7c" containerID="34199f85b5df395c55405b41e3e0ee84b1e4189317638f251886597895a61300" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.152370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1464-account-create-update-6k48w" event={"ID":"19c08552-4143-40ef-bb55-76c8d5146a7c","Type":"ContainerDied","Data":"34199f85b5df395c55405b41e3e0ee84b1e4189317638f251886597895a61300"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.154399 4776 generic.go:334] "Generic (PLEG): container finished" podID="a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" containerID="c93e9719d7d883ae856c1836e3b2090bdb5a3fdec5cfe4326e6af01888ea04dd" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.154609 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bfdc-account-create-update-bzmmp" event={"ID":"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443","Type":"ContainerDied","Data":"c93e9719d7d883ae856c1836e3b2090bdb5a3fdec5cfe4326e6af01888ea04dd"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.156805 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="ff88b930-92ae-409c-9365-c9a0131558cb" containerID="428a6bfbb182fcaa6f22bf059573f8b37e01a695e531100ac270f0987a79d309" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.156872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nskdh" event={"ID":"ff88b930-92ae-409c-9365-c9a0131558cb","Type":"ContainerDied","Data":"428a6bfbb182fcaa6f22bf059573f8b37e01a695e531100ac270f0987a79d309"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.158742 4776 generic.go:334] "Generic (PLEG): container finished" podID="54d514d3-c80d-400a-adaa-2b7adf96aab8" containerID="0b48be84415e47cb5cd1cd00f597d80e72f15ae853414ecd14142a591dc802d0" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.158808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bmgbw" event={"ID":"54d514d3-c80d-400a-adaa-2b7adf96aab8","Type":"ContainerDied","Data":"0b48be84415e47cb5cd1cd00f597d80e72f15ae853414ecd14142a591dc802d0"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.160172 4776 generic.go:334] "Generic (PLEG): container finished" podID="e0d5e297-fedb-45dc-a861-423f2cbe5700" containerID="abb8aa8983884cf9ae6e6f7d8cd64e47f7f5f5e7d22d3522ad1d1070717ca077" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.160216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d143-account-create-update-2s4kf" event={"ID":"e0d5e297-fedb-45dc-a861-423f2cbe5700","Type":"ContainerDied","Data":"abb8aa8983884cf9ae6e6f7d8cd64e47f7f5f5e7d22d3522ad1d1070717ca077"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.161644 4776 generic.go:334] "Generic (PLEG): container finished" podID="de153a8c-d3be-4575-bc85-4d4b08bdf05c" containerID="79080a7b53a9f2a322741aa184fbd559b9a54427bb29bd067346407e0c890255" exitCode=0 Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.161725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ljzs" 
event={"ID":"de153a8c-d3be-4575-bc85-4d4b08bdf05c","Type":"ContainerDied","Data":"79080a7b53a9f2a322741aa184fbd559b9a54427bb29bd067346407e0c890255"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.163962 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerStarted","Data":"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904"} Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.545375 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.616315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbcgk\" (UniqueName: \"kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk\") pod \"ff88b930-92ae-409c-9365-c9a0131558cb\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.616577 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts\") pod \"ff88b930-92ae-409c-9365-c9a0131558cb\" (UID: \"ff88b930-92ae-409c-9365-c9a0131558cb\") " Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.617200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff88b930-92ae-409c-9365-c9a0131558cb" (UID: "ff88b930-92ae-409c-9365-c9a0131558cb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.620805 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk" (OuterVolumeSpecName: "kube-api-access-qbcgk") pod "ff88b930-92ae-409c-9365-c9a0131558cb" (UID: "ff88b930-92ae-409c-9365-c9a0131558cb"). InnerVolumeSpecName "kube-api-access-qbcgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.718949 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff88b930-92ae-409c-9365-c9a0131558cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:26 crc kubenswrapper[4776]: I0128 07:07:26.718984 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbcgk\" (UniqueName: \"kubernetes.io/projected/ff88b930-92ae-409c-9365-c9a0131558cb-kube-api-access-qbcgk\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.177456 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nskdh" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.177511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nskdh" event={"ID":"ff88b930-92ae-409c-9365-c9a0131558cb","Type":"ContainerDied","Data":"b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55"} Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.177811 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b25b6b4e32c3665c49cb9c85a966a3ed0e56fbdc01e5e3d86cac28e30d6a9d55" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.992167 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z6chw"] Jan 28 07:07:27 crc kubenswrapper[4776]: E0128 07:07:27.993268 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="init" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.993361 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="init" Jan 28 07:07:27 crc kubenswrapper[4776]: E0128 07:07:27.993473 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="dnsmasq-dns" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.993552 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="dnsmasq-dns" Jan 28 07:07:27 crc kubenswrapper[4776]: E0128 07:07:27.993661 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff88b930-92ae-409c-9365-c9a0131558cb" containerName="mariadb-database-create" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.993897 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff88b930-92ae-409c-9365-c9a0131558cb" containerName="mariadb-database-create" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.994167 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1e1f65-5eae-4b82-b9b0-85e357a45188" containerName="dnsmasq-dns" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.994569 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff88b930-92ae-409c-9365-c9a0131558cb" containerName="mariadb-database-create" Jan 28 07:07:27 crc kubenswrapper[4776]: I0128 07:07:27.995445 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.000856 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.006650 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6chw"] Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.042761 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts\") pod \"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.042819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plx7t\" (UniqueName: \"kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t\") pod \"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.144520 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts\") pod 
\"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.144636 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plx7t\" (UniqueName: \"kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t\") pod \"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.147016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts\") pod \"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.178821 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plx7t\" (UniqueName: \"kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t\") pod \"root-account-create-update-z6chw\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:28 crc kubenswrapper[4776]: I0128 07:07:28.332818 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:29 crc kubenswrapper[4776]: I0128 07:07:29.196227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerStarted","Data":"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1"} Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.017204 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.024138 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.081429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvdq\" (UniqueName: \"kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq\") pod \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.081579 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf7qd\" (UniqueName: \"kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd\") pod \"19c08552-4143-40ef-bb55-76c8d5146a7c\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.081688 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts\") pod \"19c08552-4143-40ef-bb55-76c8d5146a7c\" (UID: \"19c08552-4143-40ef-bb55-76c8d5146a7c\") " Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.081720 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts\") pod \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\" (UID: \"de153a8c-d3be-4575-bc85-4d4b08bdf05c\") " Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.082152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"19c08552-4143-40ef-bb55-76c8d5146a7c" (UID: "19c08552-4143-40ef-bb55-76c8d5146a7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.082191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de153a8c-d3be-4575-bc85-4d4b08bdf05c" (UID: "de153a8c-d3be-4575-bc85-4d4b08bdf05c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.087527 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd" (OuterVolumeSpecName: "kube-api-access-zf7qd") pod "19c08552-4143-40ef-bb55-76c8d5146a7c" (UID: "19c08552-4143-40ef-bb55-76c8d5146a7c"). InnerVolumeSpecName "kube-api-access-zf7qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.097020 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq" (OuterVolumeSpecName: "kube-api-access-gpvdq") pod "de153a8c-d3be-4575-bc85-4d4b08bdf05c" (UID: "de153a8c-d3be-4575-bc85-4d4b08bdf05c"). InnerVolumeSpecName "kube-api-access-gpvdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.183477 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf7qd\" (UniqueName: \"kubernetes.io/projected/19c08552-4143-40ef-bb55-76c8d5146a7c-kube-api-access-zf7qd\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.183844 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19c08552-4143-40ef-bb55-76c8d5146a7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.183860 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de153a8c-d3be-4575-bc85-4d4b08bdf05c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.183876 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvdq\" (UniqueName: \"kubernetes.io/projected/de153a8c-d3be-4575-bc85-4d4b08bdf05c-kube-api-access-gpvdq\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.205370 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1464-account-create-update-6k48w" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.205369 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1464-account-create-update-6k48w" event={"ID":"19c08552-4143-40ef-bb55-76c8d5146a7c","Type":"ContainerDied","Data":"50423020c6ca9395ca6891fa62936104f81c6ff9de6c3b9dea0cd7fa810f9522"} Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.205477 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50423020c6ca9395ca6891fa62936104f81c6ff9de6c3b9dea0cd7fa810f9522" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.207572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ljzs" event={"ID":"de153a8c-d3be-4575-bc85-4d4b08bdf05c","Type":"ContainerDied","Data":"cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f"} Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.207599 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6ljzs" Jan 28 07:07:30 crc kubenswrapper[4776]: I0128 07:07:30.207613 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfecc5cba3c226a8f7dd0fcabfdf51d60de73df1f715acaade0bf2e01a4a6c7f" Jan 28 07:07:31 crc kubenswrapper[4776]: I0128 07:07:31.773417 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:07:31 crc kubenswrapper[4776]: I0128 07:07:31.837718 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:07:31 crc kubenswrapper[4776]: I0128 07:07:31.838185 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-z7ngz" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" containerID="cri-o://f4394b9a8079a15ae6383b73702a9057bef9dc53788ffa5b52761fb10f4d85d0" gracePeriod=10 Jan 28 07:07:32 crc kubenswrapper[4776]: I0128 07:07:32.244954 4776 generic.go:334] "Generic (PLEG): container finished" podID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerID="f4394b9a8079a15ae6383b73702a9057bef9dc53788ffa5b52761fb10f4d85d0" exitCode=0 Jan 28 07:07:32 crc kubenswrapper[4776]: I0128 07:07:32.245032 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z7ngz" event={"ID":"12e40a7e-8ea9-4135-9ec1-3904792273aa","Type":"ContainerDied","Data":"f4394b9a8079a15ae6383b73702a9057bef9dc53788ffa5b52761fb10f4d85d0"} Jan 28 07:07:32 crc kubenswrapper[4776]: I0128 07:07:32.863659 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-z7ngz" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused" Jan 28 07:07:33 crc kubenswrapper[4776]: I0128 07:07:33.852209 4776 patch_prober.go:28] interesting 
pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:07:33 crc kubenswrapper[4776]: I0128 07:07:33.852259 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.269421 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bfdc-account-create-update-bzmmp" event={"ID":"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443","Type":"ContainerDied","Data":"98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f"} Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.269790 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98121a1d89ad0373e8e4d73730a4b3eec8da41840a376e366ab78577594f435f" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.276398 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bmgbw" event={"ID":"54d514d3-c80d-400a-adaa-2b7adf96aab8","Type":"ContainerDied","Data":"e7421ebad821a5198f3af782efa3442696402f0250ea702de80f398114573c59"} Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.276439 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7421ebad821a5198f3af782efa3442696402f0250ea702de80f398114573c59" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.279907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d143-account-create-update-2s4kf" 
event={"ID":"e0d5e297-fedb-45dc-a861-423f2cbe5700","Type":"ContainerDied","Data":"442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4"} Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.279934 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442c6d59163ae92639560e5a470d7aab7e18ad3ff3fa0dca36d17c9290607cc4" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.316631 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.326680 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.381564 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481503 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts\") pod \"54d514d3-c80d-400a-adaa-2b7adf96aab8\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481628 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts\") pod \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481702 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnpb\" (UniqueName: \"kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb\") pod \"e0d5e297-fedb-45dc-a861-423f2cbe5700\" (UID: 
\"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481785 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzsql\" (UniqueName: \"kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql\") pod \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\" (UID: \"a2c47ae6-fdae-4e7c-b4d3-a8faf292b443\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481869 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzq8\" (UniqueName: \"kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8\") pod \"54d514d3-c80d-400a-adaa-2b7adf96aab8\" (UID: \"54d514d3-c80d-400a-adaa-2b7adf96aab8\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.481958 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts\") pod \"e0d5e297-fedb-45dc-a861-423f2cbe5700\" (UID: \"e0d5e297-fedb-45dc-a861-423f2cbe5700\") " Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.482333 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54d514d3-c80d-400a-adaa-2b7adf96aab8" (UID: "54d514d3-c80d-400a-adaa-2b7adf96aab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.482403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0d5e297-fedb-45dc-a861-423f2cbe5700" (UID: "e0d5e297-fedb-45dc-a861-423f2cbe5700"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.482486 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d514d3-c80d-400a-adaa-2b7adf96aab8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.482713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" (UID: "a2c47ae6-fdae-4e7c-b4d3-a8faf292b443"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.487540 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb" (OuterVolumeSpecName: "kube-api-access-8nnpb") pod "e0d5e297-fedb-45dc-a861-423f2cbe5700" (UID: "e0d5e297-fedb-45dc-a861-423f2cbe5700"). InnerVolumeSpecName "kube-api-access-8nnpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.492981 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8" (OuterVolumeSpecName: "kube-api-access-kzzq8") pod "54d514d3-c80d-400a-adaa-2b7adf96aab8" (UID: "54d514d3-c80d-400a-adaa-2b7adf96aab8"). InnerVolumeSpecName "kube-api-access-kzzq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.496471 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql" (OuterVolumeSpecName: "kube-api-access-tzsql") pod "a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" (UID: "a2c47ae6-fdae-4e7c-b4d3-a8faf292b443"). InnerVolumeSpecName "kube-api-access-tzsql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.584716 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.584754 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnpb\" (UniqueName: \"kubernetes.io/projected/e0d5e297-fedb-45dc-a861-423f2cbe5700-kube-api-access-8nnpb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.584768 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzsql\" (UniqueName: \"kubernetes.io/projected/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443-kube-api-access-tzsql\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.584780 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzzq8\" (UniqueName: \"kubernetes.io/projected/54d514d3-c80d-400a-adaa-2b7adf96aab8-kube-api-access-kzzq8\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:34 crc kubenswrapper[4776]: I0128 07:07:34.584795 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d5e297-fedb-45dc-a861-423f2cbe5700-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:35 crc kubenswrapper[4776]: I0128 07:07:35.288159 4776 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bfdc-account-create-update-bzmmp" Jan 28 07:07:35 crc kubenswrapper[4776]: I0128 07:07:35.288395 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bmgbw" Jan 28 07:07:35 crc kubenswrapper[4776]: I0128 07:07:35.288519 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d143-account-create-update-2s4kf" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.550355 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.729818 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb\") pod \"12e40a7e-8ea9-4135-9ec1-3904792273aa\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.729898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbz8q\" (UniqueName: \"kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q\") pod \"12e40a7e-8ea9-4135-9ec1-3904792273aa\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.729966 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb\") pod \"12e40a7e-8ea9-4135-9ec1-3904792273aa\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.730022 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config\") pod \"12e40a7e-8ea9-4135-9ec1-3904792273aa\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.730086 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc\") pod \"12e40a7e-8ea9-4135-9ec1-3904792273aa\" (UID: \"12e40a7e-8ea9-4135-9ec1-3904792273aa\") " Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.737231 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q" (OuterVolumeSpecName: "kube-api-access-vbz8q") pod "12e40a7e-8ea9-4135-9ec1-3904792273aa" (UID: "12e40a7e-8ea9-4135-9ec1-3904792273aa"). InnerVolumeSpecName "kube-api-access-vbz8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.773272 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config" (OuterVolumeSpecName: "config") pod "12e40a7e-8ea9-4135-9ec1-3904792273aa" (UID: "12e40a7e-8ea9-4135-9ec1-3904792273aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.774490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12e40a7e-8ea9-4135-9ec1-3904792273aa" (UID: "12e40a7e-8ea9-4135-9ec1-3904792273aa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.783217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12e40a7e-8ea9-4135-9ec1-3904792273aa" (UID: "12e40a7e-8ea9-4135-9ec1-3904792273aa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.794248 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12e40a7e-8ea9-4135-9ec1-3904792273aa" (UID: "12e40a7e-8ea9-4135-9ec1-3904792273aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.832413 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.832450 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.832459 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:41 crc kubenswrapper[4776]: I0128 07:07:41.832470 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e40a7e-8ea9-4135-9ec1-3904792273aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:41 crc 
kubenswrapper[4776]: I0128 07:07:41.832482 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbz8q\" (UniqueName: \"kubernetes.io/projected/12e40a7e-8ea9-4135-9ec1-3904792273aa-kube-api-access-vbz8q\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:42 crc kubenswrapper[4776]: E0128 07:07:42.016679 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.193:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Jan 28 07:07:42 crc kubenswrapper[4776]: E0128 07:07:42.016963 4776 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.193:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Jan 28 07:07:42 crc kubenswrapper[4776]: E0128 07:07:42.017144 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.193:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xz29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-db-sync-kqfn7_openstack(92f68261-4c2f-49dd-84b6-ee2dbd1dc36e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:07:42 crc kubenswrapper[4776]: E0128 07:07:42.018709 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-kqfn7" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.361684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z7ngz" event={"ID":"12e40a7e-8ea9-4135-9ec1-3904792273aa","Type":"ContainerDied","Data":"db10803f5b3508d5557efa2032ff045f1cd5dfa95317cb966efdc9ad0130caa4"} Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.362097 4776 scope.go:117] "RemoveContainer" containerID="f4394b9a8079a15ae6383b73702a9057bef9dc53788ffa5b52761fb10f4d85d0" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.361975 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z7ngz" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.368955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerStarted","Data":"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb"} Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.371657 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm5q5" event={"ID":"03934a8f-5036-4cf4-9dea-8f1bde73b2d7","Type":"ContainerStarted","Data":"5d4bb662b1d9ee7c5bff8bb2f4273626db7157fd8e1ebdf951d5be207f2c737e"} Jan 28 07:07:42 crc kubenswrapper[4776]: E0128 07:07:42.373149 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.193:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-kqfn7" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.388375 4776 scope.go:117] "RemoveContainer" containerID="e19299793f5af9f7ad66a0415eda83c2e7623f7ea3699ca96a249adc07707741" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.409489 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=29.409461876 podStartE2EDuration="29.409461876s" podCreationTimestamp="2026-01-28 07:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:42.400694148 +0000 UTC m=+1033.816354318" watchObservedRunningTime="2026-01-28 07:07:42.409461876 +0000 UTC m=+1033.825122076" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.441880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/root-account-create-update-z6chw"] Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.451196 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lm5q5" podStartSLOduration=2.159095378 podStartE2EDuration="19.451171521s" podCreationTimestamp="2026-01-28 07:07:23 +0000 UTC" firstStartedPulling="2026-01-28 07:07:24.679208412 +0000 UTC m=+1016.094868572" lastFinishedPulling="2026-01-28 07:07:41.971284555 +0000 UTC m=+1033.386944715" observedRunningTime="2026-01-28 07:07:42.435738192 +0000 UTC m=+1033.851398362" watchObservedRunningTime="2026-01-28 07:07:42.451171521 +0000 UTC m=+1033.866831691" Jan 28 07:07:42 crc kubenswrapper[4776]: W0128 07:07:42.455701 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9e67cfe_459a_4e26_99d6_302c7614acac.slice/crio-7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783 WatchSource:0}: Error finding container 7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783: Status 404 returned error can't find the container with id 7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783 Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.462331 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.464440 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.471517 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z7ngz"] Jan 28 07:07:42 crc kubenswrapper[4776]: I0128 07:07:42.863291 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-z7ngz" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.118:5353: i/o timeout" Jan 28 07:07:43 crc kubenswrapper[4776]: I0128 07:07:43.332375 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" path="/var/lib/kubelet/pods/12e40a7e-8ea9-4135-9ec1-3904792273aa/volumes" Jan 28 07:07:43 crc kubenswrapper[4776]: I0128 07:07:43.381916 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9e67cfe-459a-4e26-99d6-302c7614acac" containerID="606ef16b590f9ea12fb51c089298405f243cb93b27b34961bf04e0cd21fe489b" exitCode=0 Jan 28 07:07:43 crc kubenswrapper[4776]: I0128 07:07:43.381985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6chw" event={"ID":"a9e67cfe-459a-4e26-99d6-302c7614acac","Type":"ContainerDied","Data":"606ef16b590f9ea12fb51c089298405f243cb93b27b34961bf04e0cd21fe489b"} Jan 28 07:07:43 crc kubenswrapper[4776]: I0128 07:07:43.382053 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6chw" event={"ID":"a9e67cfe-459a-4e26-99d6-302c7614acac","Type":"ContainerStarted","Data":"7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783"} Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.161478 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.161577 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.170429 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.395699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.863668 4776 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.997748 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts\") pod \"a9e67cfe-459a-4e26-99d6-302c7614acac\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.997823 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plx7t\" (UniqueName: \"kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t\") pod \"a9e67cfe-459a-4e26-99d6-302c7614acac\" (UID: \"a9e67cfe-459a-4e26-99d6-302c7614acac\") " Jan 28 07:07:44 crc kubenswrapper[4776]: I0128 07:07:44.998245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9e67cfe-459a-4e26-99d6-302c7614acac" (UID: "a9e67cfe-459a-4e26-99d6-302c7614acac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.004986 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t" (OuterVolumeSpecName: "kube-api-access-plx7t") pod "a9e67cfe-459a-4e26-99d6-302c7614acac" (UID: "a9e67cfe-459a-4e26-99d6-302c7614acac"). InnerVolumeSpecName "kube-api-access-plx7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.099443 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9e67cfe-459a-4e26-99d6-302c7614acac-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.099478 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plx7t\" (UniqueName: \"kubernetes.io/projected/a9e67cfe-459a-4e26-99d6-302c7614acac-kube-api-access-plx7t\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.399795 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6chw" event={"ID":"a9e67cfe-459a-4e26-99d6-302c7614acac","Type":"ContainerDied","Data":"7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783"} Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.399831 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6chw" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.399848 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5e8ecb2b74746ba3fe5734b6428c6cf1c47b1a51bcdc7156c24b9af5d2c783" Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.402151 4776 generic.go:334] "Generic (PLEG): container finished" podID="03934a8f-5036-4cf4-9dea-8f1bde73b2d7" containerID="5d4bb662b1d9ee7c5bff8bb2f4273626db7157fd8e1ebdf951d5be207f2c737e" exitCode=0 Jan 28 07:07:45 crc kubenswrapper[4776]: I0128 07:07:45.402242 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm5q5" event={"ID":"03934a8f-5036-4cf4-9dea-8f1bde73b2d7","Type":"ContainerDied","Data":"5d4bb662b1d9ee7c5bff8bb2f4273626db7157fd8e1ebdf951d5be207f2c737e"} Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.758582 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.830697 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle\") pod \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.830886 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zzl\" (UniqueName: \"kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl\") pod \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.831032 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data\") pod \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\" (UID: \"03934a8f-5036-4cf4-9dea-8f1bde73b2d7\") " Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.837204 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl" (OuterVolumeSpecName: "kube-api-access-99zzl") pod "03934a8f-5036-4cf4-9dea-8f1bde73b2d7" (UID: "03934a8f-5036-4cf4-9dea-8f1bde73b2d7"). InnerVolumeSpecName "kube-api-access-99zzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.862356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03934a8f-5036-4cf4-9dea-8f1bde73b2d7" (UID: "03934a8f-5036-4cf4-9dea-8f1bde73b2d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.881254 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data" (OuterVolumeSpecName: "config-data") pod "03934a8f-5036-4cf4-9dea-8f1bde73b2d7" (UID: "03934a8f-5036-4cf4-9dea-8f1bde73b2d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.934286 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.934342 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zzl\" (UniqueName: \"kubernetes.io/projected/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-kube-api-access-99zzl\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:46 crc kubenswrapper[4776]: I0128 07:07:46.934364 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03934a8f-5036-4cf4-9dea-8f1bde73b2d7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.422039 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lm5q5" event={"ID":"03934a8f-5036-4cf4-9dea-8f1bde73b2d7","Type":"ContainerDied","Data":"eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c"} Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.422083 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb15f55ac1446e5af1e671e8e81e1bb0739ffafdfd1252d2d01925dac513204c" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.422120 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lm5q5" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.739103 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m6h7p"] Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740604 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740637 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740651 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03934a8f-5036-4cf4-9dea-8f1bde73b2d7" containerName="keystone-db-sync" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740660 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="03934a8f-5036-4cf4-9dea-8f1bde73b2d7" containerName="keystone-db-sync" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740682 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d5e297-fedb-45dc-a861-423f2cbe5700" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740691 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d5e297-fedb-45dc-a861-423f2cbe5700" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740709 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e67cfe-459a-4e26-99d6-302c7614acac" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740717 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e67cfe-459a-4e26-99d6-302c7614acac" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740732 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740740 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740757 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c08552-4143-40ef-bb55-76c8d5146a7c" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740767 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c08552-4143-40ef-bb55-76c8d5146a7c" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740781 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de153a8c-d3be-4575-bc85-4d4b08bdf05c" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740790 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="de153a8c-d3be-4575-bc85-4d4b08bdf05c" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740802 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d514d3-c80d-400a-adaa-2b7adf96aab8" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740809 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d514d3-c80d-400a-adaa-2b7adf96aab8" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: E0128 07:07:47.740822 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="init" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.740829 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="init" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741049 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="19c08552-4143-40ef-bb55-76c8d5146a7c" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741066 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e40a7e-8ea9-4135-9ec1-3904792273aa" containerName="dnsmasq-dns" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741082 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741091 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d514d3-c80d-400a-adaa-2b7adf96aab8" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741102 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="de153a8c-d3be-4575-bc85-4d4b08bdf05c" containerName="mariadb-database-create" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741115 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e67cfe-459a-4e26-99d6-302c7614acac" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741128 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d5e297-fedb-45dc-a861-423f2cbe5700" containerName="mariadb-account-create-update" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741144 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="03934a8f-5036-4cf4-9dea-8f1bde73b2d7" containerName="keystone-db-sync" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.741842 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.744470 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.745630 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.746248 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.747401 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52vd7" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.756681 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.764672 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.766450 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.784252 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m6h7p"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.793963 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853528 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853613 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853676 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8v6\" (UniqueName: \"kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " 
pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mr7m\" (UniqueName: \"kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853770 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853792 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853814 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853842 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") 
" pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853893 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.853925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.910912 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zxp7l"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.912119 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.919294 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.919602 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2b87g" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.919824 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.933034 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f66fd5db5-sd687"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.934563 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.947306 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.947491 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-62q5q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.947681 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.947894 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955020 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955059 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955081 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955102 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8v6\" (UniqueName: \"kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955151 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mr7m\" (UniqueName: \"kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955168 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955183 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955201 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955274 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.955297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.956147 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.957518 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.958202 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.959970 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q4gsk"] Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.961762 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.962434 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.962995 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.973628 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.973902 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.974250 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5wwr9" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.982788 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.983196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.984591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " 
pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.985352 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:47 crc kubenswrapper[4776]: I0128 07:07:47.998524 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zxp7l"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.000142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mr7m\" (UniqueName: \"kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.004287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts\") pod \"keystone-bootstrap-m6h7p\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.004357 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f66fd5db5-sd687"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.026316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8v6\" (UniqueName: \"kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6\") pod \"dnsmasq-dns-847c4cc679-qp84q\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.028391 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-q4gsk"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056628 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056682 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtst6\" (UniqueName: \"kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 
crc kubenswrapper[4776]: I0128 07:07:48.056824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056841 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4frmg\" (UniqueName: \"kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.056936 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 
crc kubenswrapper[4776]: I0128 07:07:48.056966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrf9\" (UniqueName: \"kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.057009 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.057068 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.057099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.068702 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.069166 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.090388 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.094007 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.094248 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.094668 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.094736 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.123107 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-c8whn"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.127110 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.159699 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.159832 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4frmg\" (UniqueName: \"kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.159874 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160005 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-b9w9l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 
07:07:48.159872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrf9\" (UniqueName: \"kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160560 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160614 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160759 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160777 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160807 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " 
pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4kg\" (UniqueName: \"kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.160981 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtst6\" (UniqueName: \"kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.161005 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.161021 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.161054 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.161069 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.161086 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.162473 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.221626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.222892 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrf9\" (UniqueName: \"kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.229381 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.260429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.260449 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key\") pod \"horizon-f66fd5db5-sd687\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.271055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id\") pod \"cinder-db-sync-zxp7l\" (UID: 
\"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.297564 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c8whn"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.306968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.307059 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.308050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4frmg\" (UniqueName: \"kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.308609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.308623 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtst6\" (UniqueName: \"kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6\") pod \"horizon-f66fd5db5-sd687\" (UID: 
\"2c32e828-bea4-4a05-9492-31124e2964e1\") " pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.309997 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle\") pod \"neutron-db-sync-q4gsk\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.357820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts\") pod \"cinder-db-sync-zxp7l\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361161 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4kg\" (UniqueName: \"kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361217 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361289 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmfj\" (UniqueName: \"kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361382 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361494 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361510 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361571 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.361623 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.367218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.367496 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.371828 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.375392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.376903 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.383534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.390919 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.398646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4kg\" (UniqueName: \"kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg\") pod \"ceilometer-0\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") " pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.414635 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fg2nt"] Jan 28 
07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.416298 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.419957 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44k68" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.420190 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.427361 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fg2nt"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.455954 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.458598 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.463798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.463839 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmfj\" (UniqueName: \"kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.463872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.463889 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.463931 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.465458 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.472771 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.472943 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " 
pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.489889 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.493150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.495148 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.500618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmfj\" (UniqueName: \"kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj\") pod \"placement-db-sync-c8whn\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.516589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.529747 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.536742 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.538986 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.554561 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.556617 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.565845 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566066 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566082 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566189 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data\") pod 
\"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68gbn\" (UniqueName: \"kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566322 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jnkb\" (UniqueName: \"kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566346 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key\") pod \"horizon-5bd69c9-nkv6n\" 
(UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566374 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.566588 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-88s8h" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.569058 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.572058 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.583988 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.585867 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.590378 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.595040 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.619040 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.669387 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671070 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671097 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671182 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671221 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68gbn\" (UniqueName: \"kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671260 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 
28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671287 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671302 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671319 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671365 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc 
kubenswrapper[4776]: I0128 07:07:48.671383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvlp\" (UniqueName: \"kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbmj\" (UniqueName: \"kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671455 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jnkb\" (UniqueName: \"kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 
07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.671507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.672786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.687645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.688145 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.690107 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-c8whn" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.698200 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.717470 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.719535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.744097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68gbn\" (UniqueName: \"kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn\") pod \"horizon-5bd69c9-nkv6n\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.754485 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jnkb\" (UniqueName: \"kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb\") pod \"barbican-db-sync-fg2nt\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 
07:07:48.781712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781787 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781808 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781883 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781904 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781927 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781974 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.781991 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782059 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvlp\" (UniqueName: \"kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782082 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbmj\" (UniqueName: 
\"kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782101 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782124 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782164 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.782205 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcgj\" (UniqueName: \"kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.783036 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.783301 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.783798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.783988 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.787342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.787644 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.794605 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.795377 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.795893 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.802793 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.809662 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.831183 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.842259 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wbmj\" (UniqueName: \"kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj\") pod \"dnsmasq-dns-785d8bcb8c-w4jz5\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.849959 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvlp\" (UniqueName: \"kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " 
pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.854244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.875498 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884332 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884405 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884452 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884512 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.884534 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcgj\" (UniqueName: \"kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.885206 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.886813 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.888300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.899442 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.901134 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.902684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.906023 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.912604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.925674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcgj\" (UniqueName: \"kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.952600 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.977732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:07:48 crc kubenswrapper[4776]: I0128 07:07:48.994020 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m6h7p"] Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.045572 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.126862 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.242348 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.466911 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6h7p" event={"ID":"082b9f0e-3a87-4a1f-82fc-08f8c195e054","Type":"ContainerStarted","Data":"291bb0f3b2de5177787f79b3112808fcc747ce894505dee47b6990c44a18f6d5"} Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.478265 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" event={"ID":"1517118c-3acb-4a6f-9e69-74ed0092345b","Type":"ContainerStarted","Data":"f4886fccd8a55a26cb97b983a0ad932e04c18a0dd3aa3017ecbfcb534df6adbe"} Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.684754 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zxp7l"] Jan 28 07:07:49 crc kubenswrapper[4776]: W0128 07:07:49.691131 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1400af_1c32_4f74_89f8_30b42dbb6c91.slice/crio-948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1 WatchSource:0}: Error finding container 948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1: Status 404 returned error can't find the container with id 948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1 Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.710043 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f66fd5db5-sd687"] Jan 28 07:07:49 crc kubenswrapper[4776]: W0128 07:07:49.720138 4776 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c32e828_bea4_4a05_9492_31124e2964e1.slice/crio-2e2e2fc23192afb8d74afd5af165273c5e62e2fa3abf365ed5a96a89edb65c39 WatchSource:0}: Error finding container 2e2e2fc23192afb8d74afd5af165273c5e62e2fa3abf365ed5a96a89edb65c39: Status 404 returned error can't find the container with id 2e2e2fc23192afb8d74afd5af165273c5e62e2fa3abf365ed5a96a89edb65c39 Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.722987 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q4gsk"] Jan 28 07:07:49 crc kubenswrapper[4776]: W0128 07:07:49.732593 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-82dbdba568082e1d4c8a50ab75ff9b3e5d2a65251163f2e7c2e32306fb162f75 WatchSource:0}: Error finding container 82dbdba568082e1d4c8a50ab75ff9b3e5d2a65251163f2e7c2e32306fb162f75: Status 404 returned error can't find the container with id 82dbdba568082e1d4c8a50ab75ff9b3e5d2a65251163f2e7c2e32306fb162f75 Jan 28 07:07:49 crc kubenswrapper[4776]: W0128 07:07:49.735519 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd2e30fd_49a7_4182_8e64_72c01a2394d4.slice/crio-07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf WatchSource:0}: Error finding container 07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf: Status 404 returned error can't find the container with id 07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.763684 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.786069 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"] 
Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.795280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c8whn"] Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.901103 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fg2nt"] Jan 28 07:07:49 crc kubenswrapper[4776]: I0128 07:07:49.990896 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:07:50 crc kubenswrapper[4776]: W0128 07:07:50.011158 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea16506_5466_4c4b_962f_a4cf2d233c93.slice/crio-b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada WatchSource:0}: Error finding container b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada: Status 404 returned error can't find the container with id b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.025239 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.126393 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:07:50 crc kubenswrapper[4776]: W0128 07:07:50.145951 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8a69eb_91af_4eab_8fdb_ac4ffba53b9b.slice/crio-826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87 WatchSource:0}: Error finding container 826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87: Status 404 returned error can't find the container with id 826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87 Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.315658 4776 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.349695 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f66fd5db5-sd687"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.397745 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.399839 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.420685 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.452510 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.457485 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.510160 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerStarted","Data":"2e2e2fc23192afb8d74afd5af165273c5e62e2fa3abf365ed5a96a89edb65c39"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.512596 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c8whn" event={"ID":"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03","Type":"ContainerStarted","Data":"302ba9a3012f6b9b76df2cf4b399c0405f6f5050f77a060c1ffa4f4b86712c27"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.518434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerStarted","Data":"826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87"} Jan 28 07:07:50 crc 
kubenswrapper[4776]: I0128 07:07:50.523005 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bd69c9-nkv6n" event={"ID":"552e1ac3-1b3f-4480-820b-ee07c76efd64","Type":"ContainerStarted","Data":"02c8cb29920584899d1dd7d0761e823bfba2dd5e326d26f1fb6a5c2acfb1de97"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.526070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerStarted","Data":"b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.527989 4776 generic.go:334] "Generic (PLEG): container finished" podID="1517118c-3acb-4a6f-9e69-74ed0092345b" containerID="22c490bf544d890e334874795f2c838ef05a0c71bfc71f8413414df053c2c53f" exitCode=0 Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.528037 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" event={"ID":"1517118c-3acb-4a6f-9e69-74ed0092345b","Type":"ContainerDied","Data":"22c490bf544d890e334874795f2c838ef05a0c71bfc71f8413414df053c2c53f"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.532105 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fg2nt" event={"ID":"6377f80e-0b32-479e-b33c-fc4d9f67b299","Type":"ContainerStarted","Data":"c5e09565803f99d513ec3b1bec6643c2ed0b2895e20fa36714ce5b30ddf905ca"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.544092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" event={"ID":"d2d0e1a6-28c0-4e77-8066-38169bc1d083","Type":"ContainerStarted","Data":"5780caa695f64edd9d5bc07f68b8c80ea54493aceba9cf20630861762f535eb5"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.548043 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zxp7l" 
event={"ID":"2c1400af-1c32-4f74-89f8-30b42dbb6c91","Type":"ContainerStarted","Data":"948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.558114 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.558210 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.558356 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5f4m\" (UniqueName: \"kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.558393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.558430 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.567789 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6h7p" event={"ID":"082b9f0e-3a87-4a1f-82fc-08f8c195e054","Type":"ContainerStarted","Data":"251ba1f110367388593330d151d1c88c4d914c55af6a2fd2b3c04e541b32ef84"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.579667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4gsk" event={"ID":"dd2e30fd-49a7-4182-8e64-72c01a2394d4","Type":"ContainerStarted","Data":"e3e40e8e3d347ea5c8c72bdefb22ddda79422d389b6094b60f9668c9128dfb32"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.579720 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4gsk" event={"ID":"dd2e30fd-49a7-4182-8e64-72c01a2394d4","Type":"ContainerStarted","Data":"07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.587078 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerStarted","Data":"82dbdba568082e1d4c8a50ab75ff9b3e5d2a65251163f2e7c2e32306fb162f75"} Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.645460 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m6h7p" podStartSLOduration=3.645441124 podStartE2EDuration="3.645441124s" podCreationTimestamp="2026-01-28 07:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:50.591846798 +0000 UTC m=+1042.007506968" watchObservedRunningTime="2026-01-28 07:07:50.645441124 +0000 UTC m=+1042.061101274" Jan 28 
07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.662695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5f4m\" (UniqueName: \"kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.662753 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.662800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.662989 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.663084 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.667493 4776 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-db-sync-q4gsk" podStartSLOduration=3.667481963 podStartE2EDuration="3.667481963s" podCreationTimestamp="2026-01-28 07:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:50.626541421 +0000 UTC m=+1042.042201581" watchObservedRunningTime="2026-01-28 07:07:50.667481963 +0000 UTC m=+1042.083142123" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.671398 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.673931 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.674443 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.677931 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.694275 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q5f4m\" (UniqueName: \"kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m\") pod \"horizon-7fb9dfd4f7-bpfgk\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.726339 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:07:50 crc kubenswrapper[4776]: I0128 07:07:50.957629 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.085341 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0\") pod \"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.087290 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m8v6\" (UniqueName: \"kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6\") pod \"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.087339 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb\") pod \"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.087405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc\") pod 
\"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.087419 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config\") pod \"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.087557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb\") pod \"1517118c-3acb-4a6f-9e69-74ed0092345b\" (UID: \"1517118c-3acb-4a6f-9e69-74ed0092345b\") " Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.134844 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6" (OuterVolumeSpecName: "kube-api-access-7m8v6") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "kube-api-access-7m8v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.191908 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m8v6\" (UniqueName: \"kubernetes.io/projected/1517118c-3acb-4a6f-9e69-74ed0092345b-kube-api-access-7m8v6\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.234649 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.267276 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.284280 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.295218 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.295242 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.295250 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.298246 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config" 
(OuterVolumeSpecName: "config") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.330568 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1517118c-3acb-4a6f-9e69-74ed0092345b" (UID: "1517118c-3acb-4a6f-9e69-74ed0092345b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.402175 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.402201 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1517118c-3acb-4a6f-9e69-74ed0092345b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.417680 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"] Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.610662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" event={"ID":"1517118c-3acb-4a6f-9e69-74ed0092345b","Type":"ContainerDied","Data":"f4886fccd8a55a26cb97b983a0ad932e04c18a0dd3aa3017ecbfcb534df6adbe"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.610710 4776 scope.go:117] "RemoveContainer" containerID="22c490bf544d890e334874795f2c838ef05a0c71bfc71f8413414df053c2c53f" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.610730 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-qp84q" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.616256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerStarted","Data":"a9c5000a28f314609eadfc4c02e1cc1c08f184777067e28b1ad5b5e2b1d2b750"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.624229 4776 generic.go:334] "Generic (PLEG): container finished" podID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerID="eb8f1401a023c1ab82f1f67d2dc41bfdaca095988575390a18f225c0c94f6c06" exitCode=0 Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.624289 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" event={"ID":"d2d0e1a6-28c0-4e77-8066-38169bc1d083","Type":"ContainerDied","Data":"eb8f1401a023c1ab82f1f67d2dc41bfdaca095988575390a18f225c0c94f6c06"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.624322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" event={"ID":"d2d0e1a6-28c0-4e77-8066-38169bc1d083","Type":"ContainerStarted","Data":"273dfe6a30465394f90b44084307024a84926067e74f4d612c2728375327b23a"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.624459 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.634558 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb9dfd4f7-bpfgk" event={"ID":"3acb133b-df04-4bed-bf37-fe0c45c08dc7","Type":"ContainerStarted","Data":"a8146ea734b788d1429072a88582845b564a95e2c4192aa46ec6fae49696ec64"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.646267 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerStarted","Data":"9a0429a49f30cfbbb0fc0eb60c73fc15c585b0a18ffca045b308134e7963123f"} Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.697346 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.710886 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-qp84q"] Jan 28 07:07:51 crc kubenswrapper[4776]: I0128 07:07:51.713704 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" podStartSLOduration=3.713686718 podStartE2EDuration="3.713686718s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:51.706992466 +0000 UTC m=+1043.122652626" watchObservedRunningTime="2026-01-28 07:07:51.713686718 +0000 UTC m=+1043.129346878" Jan 28 07:07:52 crc kubenswrapper[4776]: I0128 07:07:52.665530 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerStarted","Data":"45221c24d2c63a782c9a8645d56674fbeea6852cbaa28c94f4755bbd5e9aa5c5"} Jan 28 07:07:52 crc kubenswrapper[4776]: I0128 07:07:52.666086 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-log" containerID="cri-o://a9c5000a28f314609eadfc4c02e1cc1c08f184777067e28b1ad5b5e2b1d2b750" gracePeriod=30 Jan 28 07:07:52 crc kubenswrapper[4776]: I0128 07:07:52.666128 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-httpd" 
containerID="cri-o://45221c24d2c63a782c9a8645d56674fbeea6852cbaa28c94f4755bbd5e9aa5c5" gracePeriod=30 Jan 28 07:07:52 crc kubenswrapper[4776]: I0128 07:07:52.693852 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.693835389 podStartE2EDuration="4.693835389s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:52.691397952 +0000 UTC m=+1044.107058112" watchObservedRunningTime="2026-01-28 07:07:52.693835389 +0000 UTC m=+1044.109495549" Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.314410 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1517118c-3acb-4a6f-9e69-74ed0092345b" path="/var/lib/kubelet/pods/1517118c-3acb-4a6f-9e69-74ed0092345b/volumes" Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.695626 4776 generic.go:334] "Generic (PLEG): container finished" podID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerID="45221c24d2c63a782c9a8645d56674fbeea6852cbaa28c94f4755bbd5e9aa5c5" exitCode=0 Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.696281 4776 generic.go:334] "Generic (PLEG): container finished" podID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerID="a9c5000a28f314609eadfc4c02e1cc1c08f184777067e28b1ad5b5e2b1d2b750" exitCode=143 Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.695758 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerDied","Data":"45221c24d2c63a782c9a8645d56674fbeea6852cbaa28c94f4755bbd5e9aa5c5"} Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.696387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerDied","Data":"a9c5000a28f314609eadfc4c02e1cc1c08f184777067e28b1ad5b5e2b1d2b750"} Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.700590 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerStarted","Data":"ca5a8b3804762f9404ba6e324e6030336b870c0929fee944732f4bce4f7e8bd2"} Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.700773 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-log" containerID="cri-o://9a0429a49f30cfbbb0fc0eb60c73fc15c585b0a18ffca045b308134e7963123f" gracePeriod=30 Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.700933 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-httpd" containerID="cri-o://ca5a8b3804762f9404ba6e324e6030336b870c0929fee944732f4bce4f7e8bd2" gracePeriod=30 Jan 28 07:07:53 crc kubenswrapper[4776]: I0128 07:07:53.730176 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.730157655 podStartE2EDuration="5.730157655s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:07:53.717233235 +0000 UTC m=+1045.132893395" watchObservedRunningTime="2026-01-28 07:07:53.730157655 +0000 UTC m=+1045.145817815" Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.708698 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" 
containerID="ca5a8b3804762f9404ba6e324e6030336b870c0929fee944732f4bce4f7e8bd2" exitCode=0 Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.708959 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerID="9a0429a49f30cfbbb0fc0eb60c73fc15c585b0a18ffca045b308134e7963123f" exitCode=143 Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.708996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerDied","Data":"ca5a8b3804762f9404ba6e324e6030336b870c0929fee944732f4bce4f7e8bd2"} Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.709020 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerDied","Data":"9a0429a49f30cfbbb0fc0eb60c73fc15c585b0a18ffca045b308134e7963123f"} Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.711317 4776 generic.go:334] "Generic (PLEG): container finished" podID="082b9f0e-3a87-4a1f-82fc-08f8c195e054" containerID="251ba1f110367388593330d151d1c88c4d914c55af6a2fd2b3c04e541b32ef84" exitCode=0 Jan 28 07:07:54 crc kubenswrapper[4776]: I0128 07:07:54.711339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6h7p" event={"ID":"082b9f0e-3a87-4a1f-82fc-08f8c195e054","Type":"ContainerDied","Data":"251ba1f110367388593330d151d1c88c4d914c55af6a2fd2b3c04e541b32ef84"} Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.530336 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"] Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.588020 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:07:56 crc kubenswrapper[4776]: E0128 07:07:56.588436 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1517118c-3acb-4a6f-9e69-74ed0092345b" containerName="init" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.588448 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1517118c-3acb-4a6f-9e69-74ed0092345b" containerName="init" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.588862 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1517118c-3acb-4a6f-9e69-74ed0092345b" containerName="init" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.589775 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.593374 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.613046 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.636415 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"] Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637344 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637496 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637529 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637575 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.637758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrzk\" (UniqueName: \"kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.676456 4776 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/horizon-755fdfc784-krn2x"] Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.677950 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.691248 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-755fdfc784-krn2x"] Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740282 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drrzk\" (UniqueName: \"kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-tls-certs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740405 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-config-data\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc 
kubenswrapper[4776]: I0128 07:07:56.740464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc39478f-fee2-4eb1-89bc-789b5179a1ca-logs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740500 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-scripts\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740562 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjz8\" (UniqueName: \"kubernetes.io/projected/dc39478f-fee2-4eb1-89bc-789b5179a1ca-kube-api-access-9fjz8\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740589 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-secret-key\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740660 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-combined-ca-bundle\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.740676 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.741888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.742820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.742840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.749114 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.749769 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.751948 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: 
\"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.765992 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drrzk\" (UniqueName: \"kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk\") pod \"horizon-7c7f79f5b8-2xn7l\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841700 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-tls-certs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-config-data\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc39478f-fee2-4eb1-89bc-789b5179a1ca-logs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-scripts\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: 
I0128 07:07:56.841843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fjz8\" (UniqueName: \"kubernetes.io/projected/dc39478f-fee2-4eb1-89bc-789b5179a1ca-kube-api-access-9fjz8\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-secret-key\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.841898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-combined-ca-bundle\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.842500 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc39478f-fee2-4eb1-89bc-789b5179a1ca-logs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.845459 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-config-data\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.845993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/dc39478f-fee2-4eb1-89bc-789b5179a1ca-scripts\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.847415 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-combined-ca-bundle\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.849355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-tls-certs\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.850434 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc39478f-fee2-4eb1-89bc-789b5179a1ca-horizon-secret-key\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.858949 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjz8\" (UniqueName: \"kubernetes.io/projected/dc39478f-fee2-4eb1-89bc-789b5179a1ca-kube-api-access-9fjz8\") pod \"horizon-755fdfc784-krn2x\" (UID: \"dc39478f-fee2-4eb1-89bc-789b5179a1ca\") " pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.939078 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:07:56 crc kubenswrapper[4776]: I0128 07:07:56.996979 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:07:59 crc kubenswrapper[4776]: I0128 07:07:59.128651 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:07:59 crc kubenswrapper[4776]: I0128 07:07:59.204695 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"] Jan 28 07:07:59 crc kubenswrapper[4776]: I0128 07:07:59.205021 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns" containerID="cri-o://f6e78ddf1d896fb0f6b71d14e9adc6d134ce7ec5db4388a24ae61e954d2df014" gracePeriod=10 Jan 28 07:07:59 crc kubenswrapper[4776]: I0128 07:07:59.775413 4776 generic.go:334] "Generic (PLEG): container finished" podID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerID="f6e78ddf1d896fb0f6b71d14e9adc6d134ce7ec5db4388a24ae61e954d2df014" exitCode=0 Jan 28 07:07:59 crc kubenswrapper[4776]: I0128 07:07:59.775507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" event={"ID":"4cb1a460-72c7-4fc9-9a41-f92d30d63444","Type":"ContainerDied","Data":"f6e78ddf1d896fb0f6b71d14e9adc6d134ce7ec5db4388a24ae61e954d2df014"} Jan 28 07:08:01 crc kubenswrapper[4776]: I0128 07:08:01.770388 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Jan 28 07:08:03 crc kubenswrapper[4776]: I0128 07:08:03.852457 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:08:03 crc kubenswrapper[4776]: I0128 07:08:03.853053 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:08:03 crc kubenswrapper[4776]: I0128 07:08:03.853140 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:08:03 crc kubenswrapper[4776]: I0128 07:08:03.854953 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:08:03 crc kubenswrapper[4776]: I0128 07:08:03.855068 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944" gracePeriod=600 Jan 28 07:08:04 crc kubenswrapper[4776]: I0128 07:08:04.827088 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944" exitCode=0 Jan 28 07:08:04 crc kubenswrapper[4776]: I0128 07:08:04.827213 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944"} Jan 28 07:08:04 crc kubenswrapper[4776]: I0128 07:08:04.827720 4776 scope.go:117] "RemoveContainer" containerID="3a4b923d003b08375151203705e90fc5cb4620832d4a2d02a6cb87b79047a42d" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 07:08:05.757495 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 07:08:05.757694 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67fh65bhf9h75h674h5f8h64fhdh56dh54bh5d8hf9hdbh5bch68ch654h648hcdhbh576hcdh695hd7h56ch5bchc7hdbh99h646h5b6h5fdh668q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68gbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/servi
ceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5bd69c9-nkv6n_openstack(552e1ac3-1b3f-4480-820b-ee07c76efd64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 07:08:05.760414 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5bd69c9-nkv6n" podUID="552e1ac3-1b3f-4480-820b-ee07c76efd64" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 07:08:05.765188 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 07:08:05.765356 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n87h54bh5bbh5b4h5dbh98h65dh67fh78hcdh586h87h5bch5d6h5dch669h9fh6fh68fh95h75h57bh59bh5ffh85hd6h5d9h5d6h566h588hc8h6dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5f4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7fb9dfd4f7-bpfgk_openstack(3acb133b-df04-4bed-bf37-fe0c45c08dc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:08:05 crc kubenswrapper[4776]: E0128 
07:08:05.767429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7fb9dfd4f7-bpfgk" podUID="3acb133b-df04-4bed-bf37-fe0c45c08dc7" Jan 28 07:08:05 crc kubenswrapper[4776]: I0128 07:08:05.838827 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cea16506-5466-4c4b-962f-a4cf2d233c93","Type":"ContainerDied","Data":"b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada"} Jan 28 07:08:05 crc kubenswrapper[4776]: I0128 07:08:05.838867 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b31968c8a68fb7a505ce5dd788092e3c74df9877454bf80aaad836a7d136bada" Jan 28 07:08:05 crc kubenswrapper[4776]: I0128 07:08:05.948954 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046577 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046661 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgvlp\" (UniqueName: \"kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046695 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046714 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046763 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046778 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046829 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.046851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run\") pod \"cea16506-5466-4c4b-962f-a4cf2d233c93\" (UID: \"cea16506-5466-4c4b-962f-a4cf2d233c93\") " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.048134 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.052068 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs" (OuterVolumeSpecName: "logs") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.053730 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). 
InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.053967 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp" (OuterVolumeSpecName: "kube-api-access-pgvlp") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "kube-api-access-pgvlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.060064 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts" (OuterVolumeSpecName: "scripts") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.086398 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.109287 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data" (OuterVolumeSpecName: "config-data") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.121740 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cea16506-5466-4c4b-962f-a4cf2d233c93" (UID: "cea16506-5466-4c4b-962f-a4cf2d233c93"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148365 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgvlp\" (UniqueName: \"kubernetes.io/projected/cea16506-5466-4c4b-962f-a4cf2d233c93-kube-api-access-pgvlp\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148393 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148404 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148412 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148436 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148445 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148456 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cea16506-5466-4c4b-962f-a4cf2d233c93-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.148464 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea16506-5466-4c4b-962f-a4cf2d233c93-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.170369 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.250121 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.769942 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.846667 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.888984 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.922460 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.937887 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:08:06 crc kubenswrapper[4776]: E0128 07:08:06.939022 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-httpd" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.939043 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-httpd" Jan 28 07:08:06 crc kubenswrapper[4776]: E0128 07:08:06.939058 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-log" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.939065 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-log" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.939254 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-log" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.939282 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" containerName="glance-httpd" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.948717 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.955325 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.955510 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 07:08:06 crc kubenswrapper[4776]: I0128 07:08:06.957192 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064427 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064514 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb7g\" (UniqueName: \"kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064595 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064747 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.064844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166191 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb7g\" (UniqueName: 
\"kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166260 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166293 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166310 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166348 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166369 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166416 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.166857 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.171356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.171678 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.182422 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb7g\" (UniqueName: \"kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.190524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.200195 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.206871 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.207296 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc 
kubenswrapper[4776]: I0128 07:08:07.212336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.297574 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.318139 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea16506-5466-4c4b-962f-a4cf2d233c93" path="/var/lib/kubelet/pods/cea16506-5466-4c4b-962f-a4cf2d233c93/volumes" Jan 28 07:08:07 crc kubenswrapper[4776]: E0128 07:08:07.553523 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 28 07:08:07 crc kubenswrapper[4776]: E0128 07:08:07.553712 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmmfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-c8whn_openstack(6728fb0a-d0b8-4fd0-970e-7e5e496ecd03): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:08:07 crc kubenswrapper[4776]: E0128 07:08:07.555307 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-c8whn" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.659331 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777565 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777613 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mr7m\" (UniqueName: 
\"kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777823 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.777893 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts\") pod \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\" (UID: \"082b9f0e-3a87-4a1f-82fc-08f8c195e054\") " Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.782011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m" (OuterVolumeSpecName: "kube-api-access-5mr7m") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "kube-api-access-5mr7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.783464 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts" (OuterVolumeSpecName: "scripts") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.783518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.787495 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.842093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.842464 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data" (OuterVolumeSpecName: "config-data") pod "082b9f0e-3a87-4a1f-82fc-08f8c195e054" (UID: "082b9f0e-3a87-4a1f-82fc-08f8c195e054"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.859440 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m6h7p" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.859507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6h7p" event={"ID":"082b9f0e-3a87-4a1f-82fc-08f8c195e054","Type":"ContainerDied","Data":"291bb0f3b2de5177787f79b3112808fcc747ce894505dee47b6990c44a18f6d5"} Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.859540 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="291bb0f3b2de5177787f79b3112808fcc747ce894505dee47b6990c44a18f6d5" Jan 28 07:08:07 crc kubenswrapper[4776]: E0128 07:08:07.861066 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-c8whn" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880675 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880708 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mr7m\" (UniqueName: \"kubernetes.io/projected/082b9f0e-3a87-4a1f-82fc-08f8c195e054-kube-api-access-5mr7m\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880720 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880731 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880745 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:07 crc kubenswrapper[4776]: I0128 07:08:07.880756 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082b9f0e-3a87-4a1f-82fc-08f8c195e054-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.747611 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m6h7p"] Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.754714 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m6h7p"] Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.853859 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5cgcf"] Jan 28 07:08:08 crc kubenswrapper[4776]: E0128 07:08:08.854360 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b9f0e-3a87-4a1f-82fc-08f8c195e054" containerName="keystone-bootstrap" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.854377 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b9f0e-3a87-4a1f-82fc-08f8c195e054" containerName="keystone-bootstrap" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.854626 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="082b9f0e-3a87-4a1f-82fc-08f8c195e054" containerName="keystone-bootstrap" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.863886 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5cgcf"] Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.863981 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.867225 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52vd7" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.868603 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.868812 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.868967 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.869094 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.870406 4776 generic.go:334] "Generic (PLEG): container finished" podID="dd2e30fd-49a7-4182-8e64-72c01a2394d4" containerID="e3e40e8e3d347ea5c8c72bdefb22ddda79422d389b6094b60f9668c9128dfb32" exitCode=0 Jan 28 07:08:08 crc kubenswrapper[4776]: I0128 07:08:08.870439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4gsk" event={"ID":"dd2e30fd-49a7-4182-8e64-72c01a2394d4","Type":"ContainerDied","Data":"e3e40e8e3d347ea5c8c72bdefb22ddda79422d389b6094b60f9668c9128dfb32"} Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008022 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008079 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008381 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.008432 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcmh\" (UniqueName: \"kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109501 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109707 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109755 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcmh\" (UniqueName: \"kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.109828 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.114452 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.115208 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.115475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.116178 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.125467 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " 
pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.126575 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcmh\" (UniqueName: \"kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh\") pod \"keystone-bootstrap-5cgcf\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.191753 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:09 crc kubenswrapper[4776]: I0128 07:08:09.314895 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082b9f0e-3a87-4a1f-82fc-08f8c195e054" path="/var/lib/kubelet/pods/082b9f0e-3a87-4a1f-82fc-08f8c195e054/volumes" Jan 28 07:08:14 crc kubenswrapper[4776]: E0128 07:08:14.811156 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 28 07:08:14 crc kubenswrapper[4776]: E0128 07:08:14.811740 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jnkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-fg2nt_openstack(6377f80e-0b32-479e-b33c-fc4d9f67b299): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 07:08:14 crc kubenswrapper[4776]: E0128 07:08:14.812963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-fg2nt" 
podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.953452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bd69c9-nkv6n" event={"ID":"552e1ac3-1b3f-4480-820b-ee07c76efd64","Type":"ContainerDied","Data":"02c8cb29920584899d1dd7d0761e823bfba2dd5e326d26f1fb6a5c2acfb1de97"} Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.953841 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c8cb29920584899d1dd7d0761e823bfba2dd5e326d26f1fb6a5c2acfb1de97" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.955271 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4gsk" event={"ID":"dd2e30fd-49a7-4182-8e64-72c01a2394d4","Type":"ContainerDied","Data":"07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf"} Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.955350 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07b88a56f1665edfe9b8785381c4527bfb45888039083d542003b3a44e9a7cdf" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.959061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" event={"ID":"4cb1a460-72c7-4fc9-9a41-f92d30d63444","Type":"ContainerDied","Data":"2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9"} Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.959114 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2071f6840968e3b8a24e2c9d352df16170c5d0fbc373518d35d43437e6c3afb9" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.960720 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb9dfd4f7-bpfgk" event={"ID":"3acb133b-df04-4bed-bf37-fe0c45c08dc7","Type":"ContainerDied","Data":"a8146ea734b788d1429072a88582845b564a95e2c4192aa46ec6fae49696ec64"} Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.960763 4776 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8146ea734b788d1429072a88582845b564a95e2c4192aa46ec6fae49696ec64" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.963505 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b","Type":"ContainerDied","Data":"826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87"} Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.963588 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="826426b902de0ddd0efe938d539c9561d150c3febd0756bd34141352fa07ec87" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.963616 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb9dfd4f7-bpfgk" Jan 28 07:08:14 crc kubenswrapper[4776]: E0128 07:08:14.967990 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-fg2nt" podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.971487 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bd69c9-nkv6n" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.995971 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" Jan 28 07:08:14 crc kubenswrapper[4776]: I0128 07:08:14.997321 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.008208 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-q4gsk" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128505 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key\") pod \"552e1ac3-1b3f-4480-820b-ee07c76efd64\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128603 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhcgj\" (UniqueName: \"kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config\") pod \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128715 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128758 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128786 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128815 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69j9m\" (UniqueName: \"kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128867 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts\") pod \"552e1ac3-1b3f-4480-820b-ee07c76efd64\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128895 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.128968 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68gbn\" (UniqueName: \"kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn\") pod \"552e1ac3-1b3f-4480-820b-ee07c76efd64\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 
07:08:15.128997 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs\") pod \"552e1ac3-1b3f-4480-820b-ee07c76efd64\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129019 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs\") pod \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129044 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrf9\" (UniqueName: \"kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9\") pod \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129083 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5f4m\" (UniqueName: \"kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m\") pod \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129120 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data\") pod \"552e1ac3-1b3f-4480-820b-ee07c76efd64\" (UID: \"552e1ac3-1b3f-4480-820b-ee07c76efd64\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129149 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle\") pod 
\"dd2e30fd-49a7-4182-8e64-72c01a2394d4\" (UID: \"dd2e30fd-49a7-4182-8e64-72c01a2394d4\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129185 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129276 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data\") pod \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129303 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129325 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts\") pod \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129359 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129379 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key\") pod \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\" (UID: \"3acb133b-df04-4bed-bf37-fe0c45c08dc7\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129407 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129454 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs\") pod \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\" (UID: \"bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129483 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc\") pod \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\" (UID: \"4cb1a460-72c7-4fc9-9a41-f92d30d63444\") " Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129372 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs" (OuterVolumeSpecName: "logs") pod "3acb133b-df04-4bed-bf37-fe0c45c08dc7" (UID: "3acb133b-df04-4bed-bf37-fe0c45c08dc7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.129609 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs" (OuterVolumeSpecName: "logs") pod "552e1ac3-1b3f-4480-820b-ee07c76efd64" (UID: "552e1ac3-1b3f-4480-820b-ee07c76efd64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.130193 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts" (OuterVolumeSpecName: "scripts") pod "552e1ac3-1b3f-4480-820b-ee07c76efd64" (UID: "552e1ac3-1b3f-4480-820b-ee07c76efd64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.130982 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts" (OuterVolumeSpecName: "scripts") pod "3acb133b-df04-4bed-bf37-fe0c45c08dc7" (UID: "3acb133b-df04-4bed-bf37-fe0c45c08dc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.133445 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data" (OuterVolumeSpecName: "config-data") pod "552e1ac3-1b3f-4480-820b-ee07c76efd64" (UID: "552e1ac3-1b3f-4480-820b-ee07c76efd64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.134484 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.134585 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "552e1ac3-1b3f-4480-820b-ee07c76efd64" (UID: "552e1ac3-1b3f-4480-820b-ee07c76efd64"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.134749 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m" (OuterVolumeSpecName: "kube-api-access-69j9m") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "kube-api-access-69j9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.135752 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn" (OuterVolumeSpecName: "kube-api-access-68gbn") pod "552e1ac3-1b3f-4480-820b-ee07c76efd64" (UID: "552e1ac3-1b3f-4480-820b-ee07c76efd64"). InnerVolumeSpecName "kube-api-access-68gbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.137108 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9" (OuterVolumeSpecName: "kube-api-access-mbrf9") pod "dd2e30fd-49a7-4182-8e64-72c01a2394d4" (UID: "dd2e30fd-49a7-4182-8e64-72c01a2394d4"). InnerVolumeSpecName "kube-api-access-mbrf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.137431 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts" (OuterVolumeSpecName: "scripts") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.137973 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.138497 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs" (OuterVolumeSpecName: "logs") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.138993 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj" (OuterVolumeSpecName: "kube-api-access-nhcgj") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "kube-api-access-nhcgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.139919 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data" (OuterVolumeSpecName: "config-data") pod "3acb133b-df04-4bed-bf37-fe0c45c08dc7" (UID: "3acb133b-df04-4bed-bf37-fe0c45c08dc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.139922 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3acb133b-df04-4bed-bf37-fe0c45c08dc7" (UID: "3acb133b-df04-4bed-bf37-fe0c45c08dc7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.153714 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m" (OuterVolumeSpecName: "kube-api-access-q5f4m") pod "3acb133b-df04-4bed-bf37-fe0c45c08dc7" (UID: "3acb133b-df04-4bed-bf37-fe0c45c08dc7"). InnerVolumeSpecName "kube-api-access-q5f4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.164912 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config" (OuterVolumeSpecName: "config") pod "dd2e30fd-49a7-4182-8e64-72c01a2394d4" (UID: "dd2e30fd-49a7-4182-8e64-72c01a2394d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.169923 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2e30fd-49a7-4182-8e64-72c01a2394d4" (UID: "dd2e30fd-49a7-4182-8e64-72c01a2394d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.180804 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.191442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config" (OuterVolumeSpecName: "config") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.196670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.198435 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.200170 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.202333 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data" (OuterVolumeSpecName: "config-data") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.202746 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" (UID: "bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.206488 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cb1a460-72c7-4fc9-9a41-f92d30d63444" (UID: "4cb1a460-72c7-4fc9-9a41-f92d30d63444"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234347 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234382 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3acb133b-df04-4bed-bf37-fe0c45c08dc7-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234394 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234403 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234412 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-logs\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234421 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234428 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/552e1ac3-1b3f-4480-820b-ee07c76efd64-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234437 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhcgj\" (UniqueName: \"kubernetes.io/projected/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-kube-api-access-nhcgj\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234447 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234454 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-config\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234462 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234469 4776 reconciler_common.go:293] "Volume detached for volume
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234501 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234510 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69j9m\" (UniqueName: \"kubernetes.io/projected/4cb1a460-72c7-4fc9-9a41-f92d30d63444-kube-api-access-69j9m\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234519 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234526 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234534 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68gbn\" (UniqueName: \"kubernetes.io/projected/552e1ac3-1b3f-4480-820b-ee07c76efd64-kube-api-access-68gbn\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234542 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e1ac3-1b3f-4480-820b-ee07c76efd64-logs\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234563 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3acb133b-df04-4bed-bf37-fe0c45c08dc7-logs\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234574 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbrf9\" (UniqueName: \"kubernetes.io/projected/dd2e30fd-49a7-4182-8e64-72c01a2394d4-kube-api-access-mbrf9\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234583 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5f4m\" (UniqueName: \"kubernetes.io/projected/3acb133b-df04-4bed-bf37-fe0c45c08dc7-kube-api-access-q5f4m\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234591 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/552e1ac3-1b3f-4480-820b-ee07c76efd64-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234598 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e30fd-49a7-4182-8e64-72c01a2394d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234606 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb1a460-72c7-4fc9-9a41-f92d30d63444-config\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234614 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234622 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.234630 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3acb133b-df04-4bed-bf37-fe0c45c08dc7-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.250324 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Jan 28 07:08:15 crc kubenswrapper[4776]: I0128 07:08:15.335905 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.033041 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4gsk"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.033115 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bd69c9-nkv6n"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.033147 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.033174 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb9dfd4f7-bpfgk"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.033291 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.104998 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.120830 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7fb9dfd4f7-bpfgk"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.159029 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"]
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.160817 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.161096 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4frmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zxp7l_openstack(2c1400af-1c32-4f74-89f8-30b42dbb6c91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.162319 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zxp7l" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.170188 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bd69c9-nkv6n"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.204319 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.226224 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-hf97r"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.249961 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.294671 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.312536 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.313667 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-log"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.313686 4776
state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-log"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.313704 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="init"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.313711 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="init"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.313736 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-httpd"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.313742 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-httpd"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.313761 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.313768 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns"
Jan 28 07:08:16 crc kubenswrapper[4776]: E0128 07:08:16.313787 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2e30fd-49a7-4182-8e64-72c01a2394d4" containerName="neutron-db-sync"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.313793 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2e30fd-49a7-4182-8e64-72c01a2394d4" containerName="neutron-db-sync"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.314107 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-httpd"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.314127 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.314141 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2e30fd-49a7-4182-8e64-72c01a2394d4" containerName="neutron-db-sync"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.314153 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" containerName="glance-log"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.315751 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.321205 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.321338 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.325835 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.438862 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.440659 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.460447 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.474069 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.477149 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.483667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.485730 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.486246 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5wwr9"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.487630 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.499925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500059 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500118 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzmq\" (UniqueName: \"kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500147 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500197 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.500329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc
kubenswrapper[4776]: I0128 07:08:16.501809 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"]
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601572 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601883 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdz85\" (UniqueName: \"kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601902 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p48pz\" (UniqueName: \"kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601923 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzmq\" (UniqueName: \"kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601970 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.601993 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602012 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602034 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602077 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602098 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602172 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602227 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602276 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.602666 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.605750 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.609752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.612618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.615520 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0"
Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.630501 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") "
pod="openstack/glance-default-internal-api-0" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.635505 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzmq\" (UniqueName: \"kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.651069 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.651247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc 
kubenswrapper[4776]: I0128 07:08:16.703515 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703566 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703642 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703702 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdz85\" (UniqueName: \"kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703717 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p48pz\" (UniqueName: \"kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.703748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.704464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.704488 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.705126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.705456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.708100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.708217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.709433 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: 
\"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.709616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.713736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.725055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdz85\" (UniqueName: \"kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85\") pod \"dnsmasq-dns-55f844cf75-th8p7\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.731876 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p48pz\" (UniqueName: \"kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz\") pod \"neutron-7f4d58bd76-2fnlt\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.746032 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.783714 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-hf97r" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.797212 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.820963 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:16 crc kubenswrapper[4776]: I0128 07:08:16.956818 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-755fdfc784-krn2x"] Jan 28 07:08:16 crc kubenswrapper[4776]: W0128 07:08:16.980482 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc39478f_fee2_4eb1_89bc_789b5179a1ca.slice/crio-89f773e743d37b69274d32ef082c4eaea20b4c23ac30b2a8b199bea042ca0f68 WatchSource:0}: Error finding container 89f773e743d37b69274d32ef082c4eaea20b4c23ac30b2a8b199bea042ca0f68: Status 404 returned error can't find the container with id 89f773e743d37b69274d32ef082c4eaea20b4c23ac30b2a8b199bea042ca0f68 Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.062307 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.066479 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fdfc784-krn2x" event={"ID":"dc39478f-fee2-4eb1-89bc-789b5179a1ca","Type":"ContainerStarted","Data":"89f773e743d37b69274d32ef082c4eaea20b4c23ac30b2a8b199bea042ca0f68"} Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.101586 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0"} Jan 28 07:08:17 crc kubenswrapper[4776]: E0128 07:08:17.130857 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zxp7l" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.195173 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.202362 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5cgcf"] Jan 28 07:08:17 crc kubenswrapper[4776]: W0128 07:08:17.260669 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5180ed1_0d82_4c44_aed4_3f3a5b34af93.slice/crio-595c2365ffce3309df31267fa761d2f42a67961a567faa802ace558f1962b20c WatchSource:0}: Error finding container 595c2365ffce3309df31267fa761d2f42a67961a567faa802ace558f1962b20c: Status 404 returned error can't find the container with id 595c2365ffce3309df31267fa761d2f42a67961a567faa802ace558f1962b20c Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.381732 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acb133b-df04-4bed-bf37-fe0c45c08dc7" path="/var/lib/kubelet/pods/3acb133b-df04-4bed-bf37-fe0c45c08dc7/volumes" Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.382521 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb1a460-72c7-4fc9-9a41-f92d30d63444" 
path="/var/lib/kubelet/pods/4cb1a460-72c7-4fc9-9a41-f92d30d63444/volumes" Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.383699 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552e1ac3-1b3f-4480-820b-ee07c76efd64" path="/var/lib/kubelet/pods/552e1ac3-1b3f-4480-820b-ee07c76efd64/volumes" Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.384268 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b" path="/var/lib/kubelet/pods/bc8a69eb-91af-4eab-8fdb-ac4ffba53b9b/volumes" Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.823082 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.943792 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"] Jan 28 07:08:17 crc kubenswrapper[4776]: W0128 07:08:17.959044 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bc5428_35ca_44a5_8fa9_7d11ec4c6804.slice/crio-8acf1a481989aea13397267c3093efe2dd3358333ae599fef839a651142d1c9d WatchSource:0}: Error finding container 8acf1a481989aea13397267c3093efe2dd3358333ae599fef839a651142d1c9d: Status 404 returned error can't find the container with id 8acf1a481989aea13397267c3093efe2dd3358333ae599fef839a651142d1c9d Jan 28 07:08:17 crc kubenswrapper[4776]: I0128 07:08:17.959443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"] Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.117957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cgcf" event={"ID":"e1eea745-adc8-4e45-b52a-48190c7572b1","Type":"ContainerStarted","Data":"5597d77ece7eef3736c8b6f7018e9ea1484bc323b436166a843fa2f82c7a7899"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.118272 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cgcf" event={"ID":"e1eea745-adc8-4e45-b52a-48190c7572b1","Type":"ContainerStarted","Data":"c4c69ff32d43b74cec9be0ee0a8323380c8e97aa6ad12ed9231d0382847ac431"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.125476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerStarted","Data":"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.125518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerStarted","Data":"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.125508 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f66fd5db5-sd687" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon-log" containerID="cri-o://8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e" gracePeriod=30 Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.125683 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f66fd5db5-sd687" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon" containerID="cri-o://6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa" gracePeriod=30 Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.130332 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerStarted","Data":"517a8fd914c0b490dae9bdddcce183872f3246515cfb13e0a30accc6d643d7d9"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.131940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7c7f79f5b8-2xn7l" event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerStarted","Data":"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.131969 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f79f5b8-2xn7l" event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerStarted","Data":"595c2365ffce3309df31267fa761d2f42a67961a567faa802ace558f1962b20c"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.133282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerStarted","Data":"4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.137611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerStarted","Data":"87d8b403c99f650b6ac90ddb2b994de9ac7d7b557d181ff16409c39fe90da324"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.158244 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" event={"ID":"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804","Type":"ContainerStarted","Data":"8acf1a481989aea13397267c3093efe2dd3358333ae599fef839a651142d1c9d"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.160774 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5cgcf" podStartSLOduration=10.160755759 podStartE2EDuration="10.160755759s" podCreationTimestamp="2026-01-28 07:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:18.148421634 +0000 UTC m=+1069.564081794" watchObservedRunningTime="2026-01-28 07:08:18.160755759 +0000 UTC 
m=+1069.576415919" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.164511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerStarted","Data":"e5c59fca3881f44d47d647801b6d60dc20440f97bab558f11dfac93f5c5e1684"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.171705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-kqfn7" event={"ID":"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e","Type":"ContainerStarted","Data":"1cb617be81d77b7ccfab7ea704ef6632cf07a245eaad455c79329a67813a41cc"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.183903 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fdfc784-krn2x" event={"ID":"dc39478f-fee2-4eb1-89bc-789b5179a1ca","Type":"ContainerStarted","Data":"82a32bad66cb57ffc48bd7907ddc2a9dd3e64677e2292e1cf9572bdda60f617a"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.183940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fdfc784-krn2x" event={"ID":"dc39478f-fee2-4eb1-89bc-789b5179a1ca","Type":"ContainerStarted","Data":"df76b18cc81f3929b45b9aca7f65380333ce137180f1c7e1ca7bbf463441ad6b"} Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.195780 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f66fd5db5-sd687" podStartSLOduration=6.071377197 podStartE2EDuration="31.195700589s" podCreationTimestamp="2026-01-28 07:07:47 +0000 UTC" firstStartedPulling="2026-01-28 07:07:49.723457054 +0000 UTC m=+1041.139117214" lastFinishedPulling="2026-01-28 07:08:14.847780446 +0000 UTC m=+1066.263440606" observedRunningTime="2026-01-28 07:08:18.172278762 +0000 UTC m=+1069.587938922" watchObservedRunningTime="2026-01-28 07:08:18.195700589 +0000 UTC m=+1069.611360749" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.209804 4776 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/watcher-db-sync-kqfn7" podStartSLOduration=4.24026663 podStartE2EDuration="56.209785301s" podCreationTimestamp="2026-01-28 07:07:22 +0000 UTC" firstStartedPulling="2026-01-28 07:07:24.255112684 +0000 UTC m=+1015.670772844" lastFinishedPulling="2026-01-28 07:08:16.224631365 +0000 UTC m=+1067.640291515" observedRunningTime="2026-01-28 07:08:18.206391988 +0000 UTC m=+1069.622052148" watchObservedRunningTime="2026-01-28 07:08:18.209785301 +0000 UTC m=+1069.625445461" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.241041 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-755fdfc784-krn2x" podStartSLOduration=22.2410229 podStartE2EDuration="22.2410229s" podCreationTimestamp="2026-01-28 07:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:18.229120077 +0000 UTC m=+1069.644780237" watchObservedRunningTime="2026-01-28 07:08:18.2410229 +0000 UTC m=+1069.656683060" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.518589 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.900683 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.903701 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.906021 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.906237 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.934698 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994688 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994749 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994770 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:18 crc kubenswrapper[4776]: I0128 07:08:18.994829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf6gj\" (UniqueName: \"kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097654 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097718 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6gj\" (UniqueName: \"kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097807 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.097827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs\") pod 
\"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.113273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.119071 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.120882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.121833 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.124209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 
07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.125986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6gj\" (UniqueName: \"kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.126102 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config\") pod \"neutron-659688b465-m49kr\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.200297 4776 generic.go:334] "Generic (PLEG): container finished" podID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerID="862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6" exitCode=0 Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.200458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" event={"ID":"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804","Type":"ContainerDied","Data":"862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.216349 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f79f5b8-2xn7l" event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerStarted","Data":"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.242685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerStarted","Data":"5a3df3dd7e59af0e67d5ea3d98b2294db6ff657761526999d623884d0cbb6796"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.243085 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerStarted","Data":"45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.244588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.249259 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.251202 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c7f79f5b8-2xn7l" podStartSLOduration=23.251179185 podStartE2EDuration="23.251179185s" podCreationTimestamp="2026-01-28 07:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:19.243433775 +0000 UTC m=+1070.659093935" watchObservedRunningTime="2026-01-28 07:08:19.251179185 +0000 UTC m=+1070.666839335" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.260038 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c8whn" event={"ID":"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03","Type":"ContainerStarted","Data":"d2b7bda6eb34b7f0aafe525e62ba6f60edf67f90dfa6a837e836ddbe372eee32"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.271688 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerStarted","Data":"b5e56bd0be33523a64dd7f889e0d34604e29671e8c1d5c4ba1b566729e090a4d"} Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.273640 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f4d58bd76-2fnlt" podStartSLOduration=3.273618065 
podStartE2EDuration="3.273618065s" podCreationTimestamp="2026-01-28 07:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:19.264932299 +0000 UTC m=+1070.680592459" watchObservedRunningTime="2026-01-28 07:08:19.273618065 +0000 UTC m=+1070.689278225" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.288661 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-c8whn" podStartSLOduration=2.280139041 podStartE2EDuration="31.288645044s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="2026-01-28 07:07:49.79320356 +0000 UTC m=+1041.208863710" lastFinishedPulling="2026-01-28 07:08:18.801709563 +0000 UTC m=+1070.217369713" observedRunningTime="2026-01-28 07:08:19.28373772 +0000 UTC m=+1070.699397880" watchObservedRunningTime="2026-01-28 07:08:19.288645044 +0000 UTC m=+1070.704305204" Jan 28 07:08:19 crc kubenswrapper[4776]: I0128 07:08:19.394185 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerStarted","Data":"4c93005f1832027045b278fb28689f32912bd457d90ef31e7cacbe5bd6cd8b14"} Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.185858 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.365434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerStarted","Data":"33cf145a97753123cfad01797a3b02d87459dfeda5a994cb916ae2b60c05f7d7"} Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.376896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" 
event={"ID":"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804","Type":"ContainerStarted","Data":"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10"} Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.377672 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.381462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerStarted","Data":"869dcbf82905f91282ba235e05da70b70179d65631d1fd86fd63d71cdb79694b"} Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.403871 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.403855473 podStartE2EDuration="14.403855473s" podCreationTimestamp="2026-01-28 07:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:20.398463377 +0000 UTC m=+1071.814123537" watchObservedRunningTime="2026-01-28 07:08:20.403855473 +0000 UTC m=+1071.819515633" Jan 28 07:08:20 crc kubenswrapper[4776]: I0128 07:08:20.433324 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" podStartSLOduration=4.433305583 podStartE2EDuration="4.433305583s" podCreationTimestamp="2026-01-28 07:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:20.426980771 +0000 UTC m=+1071.842640931" watchObservedRunningTime="2026-01-28 07:08:20.433305583 +0000 UTC m=+1071.848965743" Jan 28 07:08:21 crc kubenswrapper[4776]: I0128 07:08:21.393925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" 
event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerStarted","Data":"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70"} Jan 28 07:08:21 crc kubenswrapper[4776]: I0128 07:08:21.395823 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerStarted","Data":"d0d276b338d842190888892da687520404f2f083ff0b36044a1f5eb6dda63bb6"} Jan 28 07:08:21 crc kubenswrapper[4776]: I0128 07:08:21.416037 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.416008683 podStartE2EDuration="5.416008683s" podCreationTimestamp="2026-01-28 07:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:21.413348531 +0000 UTC m=+1072.829008691" watchObservedRunningTime="2026-01-28 07:08:21.416008683 +0000 UTC m=+1072.831668833" Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.423044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerStarted","Data":"43accac2de5072edeb03536a049dce90caf20f7eff3b337ec8fbdc4c6b66f285"} Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.429760 4776 generic.go:334] "Generic (PLEG): container finished" podID="e1eea745-adc8-4e45-b52a-48190c7572b1" containerID="5597d77ece7eef3736c8b6f7018e9ea1484bc323b436166a843fa2f82c7a7899" exitCode=0 Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.429840 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cgcf" event={"ID":"e1eea745-adc8-4e45-b52a-48190c7572b1","Type":"ContainerDied","Data":"5597d77ece7eef3736c8b6f7018e9ea1484bc323b436166a843fa2f82c7a7899"} Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.440156 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" containerID="d2b7bda6eb34b7f0aafe525e62ba6f60edf67f90dfa6a837e836ddbe372eee32" exitCode=0 Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.440280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c8whn" event={"ID":"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03","Type":"ContainerDied","Data":"d2b7bda6eb34b7f0aafe525e62ba6f60edf67f90dfa6a837e836ddbe372eee32"} Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.448252 4776 generic.go:334] "Generic (PLEG): container finished" podID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" containerID="1cb617be81d77b7ccfab7ea704ef6632cf07a245eaad455c79329a67813a41cc" exitCode=0 Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.448358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-kqfn7" event={"ID":"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e","Type":"ContainerDied","Data":"1cb617be81d77b7ccfab7ea704ef6632cf07a245eaad455c79329a67813a41cc"} Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.464953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerStarted","Data":"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a"} Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.465420 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-659688b465-m49kr" Jan 28 07:08:22 crc kubenswrapper[4776]: I0128 07:08:22.536158 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-659688b465-m49kr" podStartSLOduration=4.536132827 podStartE2EDuration="4.536132827s" podCreationTimestamp="2026-01-28 07:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:22.517637824 +0000 UTC m=+1073.933297984" 
watchObservedRunningTime="2026-01-28 07:08:22.536132827 +0000 UTC m=+1073.951792977" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.798347 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.798942 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.823391 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.861865 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.883803 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.909439 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.909850 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="dnsmasq-dns" containerID="cri-o://273dfe6a30465394f90b44084307024a84926067e74f4d612c2728375327b23a" gracePeriod=10 Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.943536 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.944338 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.997541 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:08:26 crc kubenswrapper[4776]: I0128 07:08:26.998350 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.299211 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.299257 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.349303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.357690 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.521269 4776 generic.go:334] "Generic (PLEG): container finished" podID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerID="273dfe6a30465394f90b44084307024a84926067e74f4d612c2728375327b23a" exitCode=0 Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.521373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" event={"ID":"d2d0e1a6-28c0-4e77-8066-38169bc1d083","Type":"ContainerDied","Data":"273dfe6a30465394f90b44084307024a84926067e74f4d612c2728375327b23a"} Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.522175 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.522228 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.522241 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.522251 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.852038 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893133 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcmh\" (UniqueName: \"kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893281 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893330 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893393 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893463 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.893500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys\") pod \"e1eea745-adc8-4e45-b52a-48190c7572b1\" (UID: \"e1eea745-adc8-4e45-b52a-48190c7572b1\") " Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.902868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts" (OuterVolumeSpecName: "scripts") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.904203 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.904518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh" (OuterVolumeSpecName: "kube-api-access-qrcmh") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "kube-api-access-qrcmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.907646 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.939662 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data" (OuterVolumeSpecName: "config-data") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.950633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1eea745-adc8-4e45-b52a-48190c7572b1" (UID: "e1eea745-adc8-4e45-b52a-48190c7572b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996889 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996937 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996952 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996964 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrcmh\" (UniqueName: \"kubernetes.io/projected/e1eea745-adc8-4e45-b52a-48190c7572b1-kube-api-access-qrcmh\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996978 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:27 crc kubenswrapper[4776]: I0128 07:08:27.996993 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1eea745-adc8-4e45-b52a-48190c7572b1-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.533211 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5cgcf" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.533504 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cgcf" event={"ID":"e1eea745-adc8-4e45-b52a-48190c7572b1","Type":"ContainerDied","Data":"c4c69ff32d43b74cec9be0ee0a8323380c8e97aa6ad12ed9231d0382847ac431"} Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.533592 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4c69ff32d43b74cec9be0ee0a8323380c8e97aa6ad12ed9231d0382847ac431" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.941728 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-979f97b77-x2lng"] Jan 28 07:08:28 crc kubenswrapper[4776]: E0128 07:08:28.942118 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1eea745-adc8-4e45-b52a-48190c7572b1" containerName="keystone-bootstrap" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.942133 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1eea745-adc8-4e45-b52a-48190c7572b1" containerName="keystone-bootstrap" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.942340 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1eea745-adc8-4e45-b52a-48190c7572b1" containerName="keystone-bootstrap" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.942960 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.950731 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.991762 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.992094 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.992682 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.992873 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 07:08:28 crc kubenswrapper[4776]: I0128 07:08:28.993592 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52vd7" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.021147 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-979f97b77-x2lng"] Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-scripts\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126263 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-credential-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " 
pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126323 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zqb\" (UniqueName: \"kubernetes.io/projected/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-kube-api-access-k5zqb\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126355 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-combined-ca-bundle\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126379 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-public-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126480 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-config-data\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126509 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-fernet-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") 
" pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.126529 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-internal-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.128072 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.230028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-combined-ca-bundle\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.231039 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-public-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.231591 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-config-data\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.231633 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-fernet-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.231662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-internal-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.232045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-scripts\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.232093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-credential-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.232127 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zqb\" (UniqueName: \"kubernetes.io/projected/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-kube-api-access-k5zqb\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.235203 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-combined-ca-bundle\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.239193 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-config-data\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.241341 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-fernet-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.245135 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-credential-keys\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.246126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-internal-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.246799 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-scripts\") pod \"keystone-979f97b77-x2lng\" (UID: 
\"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.247352 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-public-tls-certs\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.250971 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zqb\" (UniqueName: \"kubernetes.io/projected/9d7ed0f7-4d79-42b7-8f0d-805e6994e958-kube-api-access-k5zqb\") pod \"keystone-979f97b77-x2lng\" (UID: \"9d7ed0f7-4d79-42b7-8f0d-805e6994e958\") " pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.327723 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-52vd7" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.330513 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.637308 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.637417 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 07:08:29 crc kubenswrapper[4776]: I0128 07:08:29.902127 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.192201 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.192336 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.194929 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.563400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-kqfn7" event={"ID":"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e","Type":"ContainerDied","Data":"125a1cd0e1f5d11cf2d1654770399434723ddb4ac31b391bc6f08c080d3966c4"} Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.563730 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125a1cd0e1f5d11cf2d1654770399434723ddb4ac31b391bc6f08c080d3966c4" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.574655 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c8whn" event={"ID":"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03","Type":"ContainerDied","Data":"302ba9a3012f6b9b76df2cf4b399c0405f6f5050f77a060c1ffa4f4b86712c27"} Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.574691 4776 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302ba9a3012f6b9b76df2cf4b399c0405f6f5050f77a060c1ffa4f4b86712c27" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.682433 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c8whn" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.764010 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.775294 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts\") pod \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.775377 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data\") pod \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.775425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs\") pod \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.775457 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle\") pod \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.775512 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tmmfj\" (UniqueName: \"kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj\") pod \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\" (UID: \"6728fb0a-d0b8-4fd0-970e-7e5e496ecd03\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.777866 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs" (OuterVolumeSpecName: "logs") pod "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" (UID: "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.794421 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj" (OuterVolumeSpecName: "kube-api-access-tmmfj") pod "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" (UID: "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03"). InnerVolumeSpecName "kube-api-access-tmmfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.818987 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts" (OuterVolumeSpecName: "scripts") pod "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" (UID: "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.865650 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data" (OuterVolumeSpecName: "config-data") pod "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" (UID: "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.877301 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data\") pod \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.877364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xz29\" (UniqueName: \"kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29\") pod \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.877394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle\") pod \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.877470 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data\") pod \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\" (UID: \"92f68261-4c2f-49dd-84b6-ee2dbd1dc36e\") " Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.877992 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.878015 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-config-data\") on node \"crc\" DevicePath \"\"" Jan 
28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.878027 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.878036 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmmfj\" (UniqueName: \"kubernetes.io/projected/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-kube-api-access-tmmfj\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.889670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" (UID: "6728fb0a-d0b8-4fd0-970e-7e5e496ecd03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.889709 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" (UID: "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.890347 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29" (OuterVolumeSpecName: "kube-api-access-9xz29") pod "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" (UID: "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e"). InnerVolumeSpecName "kube-api-access-9xz29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.980274 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xz29\" (UniqueName: \"kubernetes.io/projected/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-kube-api-access-9xz29\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.980642 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:30 crc kubenswrapper[4776]: I0128 07:08:30.980656 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.027095 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.039827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" (UID: "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.089914 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.090085 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.090112 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wbmj\" (UniqueName: \"kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.090132 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.090195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.090244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0\") pod \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\" (UID: \"d2d0e1a6-28c0-4e77-8066-38169bc1d083\") " Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.092008 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.127683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj" (OuterVolumeSpecName: "kube-api-access-2wbmj") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "kube-api-access-2wbmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.127782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data" (OuterVolumeSpecName: "config-data") pod "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" (UID: "92f68261-4c2f-49dd-84b6-ee2dbd1dc36e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.167183 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-979f97b77-x2lng"] Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.184288 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.187282 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.191764 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config" (OuterVolumeSpecName: "config") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193406 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wbmj\" (UniqueName: \"kubernetes.io/projected/d2d0e1a6-28c0-4e77-8066-38169bc1d083-kube-api-access-2wbmj\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193434 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193443 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193452 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193460 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.193942 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.198965 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2d0e1a6-28c0-4e77-8066-38169bc1d083" (UID: "d2d0e1a6-28c0-4e77-8066-38169bc1d083"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.297617 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.297651 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2d0e1a6-28c0-4e77-8066-38169bc1d083-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.589314 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" event={"ID":"d2d0e1a6-28c0-4e77-8066-38169bc1d083","Type":"ContainerDied","Data":"5780caa695f64edd9d5bc07f68b8c80ea54493aceba9cf20630861762f535eb5"} Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.590014 4776 scope.go:117] "RemoveContainer" containerID="273dfe6a30465394f90b44084307024a84926067e74f4d612c2728375327b23a" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.589329 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-w4jz5" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.593472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerStarted","Data":"d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c"} Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.608101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fg2nt" event={"ID":"6377f80e-0b32-479e-b33c-fc4d9f67b299","Type":"ContainerStarted","Data":"a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e"} Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.611152 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-979f97b77-x2lng" event={"ID":"9d7ed0f7-4d79-42b7-8f0d-805e6994e958","Type":"ContainerStarted","Data":"45715d032583c8b5f5435ae4397e1debd18d1bebb6a0b1c09429266243a59530"} Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.611186 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c8whn" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.612095 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-kqfn7" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.636789 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fg2nt" podStartSLOduration=2.900538735 podStartE2EDuration="43.636771089s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="2026-01-28 07:07:49.921573117 +0000 UTC m=+1041.337233277" lastFinishedPulling="2026-01-28 07:08:30.657805471 +0000 UTC m=+1082.073465631" observedRunningTime="2026-01-28 07:08:31.62538701 +0000 UTC m=+1083.041047190" watchObservedRunningTime="2026-01-28 07:08:31.636771089 +0000 UTC m=+1083.052431249" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.643575 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-979f97b77-x2lng" podStartSLOduration=3.643561994 podStartE2EDuration="3.643561994s" podCreationTimestamp="2026-01-28 07:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:31.642431383 +0000 UTC m=+1083.058091543" watchObservedRunningTime="2026-01-28 07:08:31.643561994 +0000 UTC m=+1083.059222154" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.674082 4776 scope.go:117] "RemoveContainer" containerID="eb8f1401a023c1ab82f1f67d2dc41bfdaca095988575390a18f225c0c94f6c06" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.676609 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.694607 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-w4jz5"] Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.831446 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65b6d456f6-wvlnv"] Jan 28 07:08:31 crc kubenswrapper[4776]: E0128 07:08:31.831915 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" containerName="placement-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.831936 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" containerName="placement-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: E0128 07:08:31.831961 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" containerName="watcher-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.831970 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" containerName="watcher-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: E0128 07:08:31.831992 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="dnsmasq-dns" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.831999 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="dnsmasq-dns" Jan 28 07:08:31 crc kubenswrapper[4776]: E0128 07:08:31.832012 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="init" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.832021 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="init" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.832263 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" containerName="placement-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.832301 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" containerName="dnsmasq-dns" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.832317 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" containerName="watcher-db-sync" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.833904 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.842526 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.842725 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-b9w9l" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.842952 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.843115 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.843217 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.843673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65b6d456f6-wvlnv"] Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-config-data\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918144 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzsk\" (UniqueName: \"kubernetes.io/projected/388a79ff-9e00-4cc1-a935-20a9b00402a8-kube-api-access-sbzsk\") pod 
\"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a79ff-9e00-4cc1-a935-20a9b00402a8-logs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918216 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-internal-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918249 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-scripts\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-public-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:31 crc kubenswrapper[4776]: I0128 07:08:31.918477 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-combined-ca-bundle\") pod 
\"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.020450 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a79ff-9e00-4cc1-a935-20a9b00402a8-logs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.020847 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-internal-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.020926 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-scripts\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.020956 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-public-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.021023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388a79ff-9e00-4cc1-a935-20a9b00402a8-logs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 
07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.021808 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-combined-ca-bundle\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.021884 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-config-data\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.021932 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzsk\" (UniqueName: \"kubernetes.io/projected/388a79ff-9e00-4cc1-a935-20a9b00402a8-kube-api-access-sbzsk\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.035326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-internal-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.042047 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-public-tls-certs\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.055514 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-config-data\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.056026 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-combined-ca-bundle\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.056126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388a79ff-9e00-4cc1-a935-20a9b00402a8-scripts\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.056608 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.057979 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.059504 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzsk\" (UniqueName: \"kubernetes.io/projected/388a79ff-9e00-4cc1-a935-20a9b00402a8-kube-api-access-sbzsk\") pod \"placement-65b6d456f6-wvlnv\" (UID: \"388a79ff-9e00-4cc1-a935-20a9b00402a8\") " pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.070674 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.071449 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-hcstr" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.100617 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.126481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.126518 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.126597 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzrv\" (UniqueName: \"kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv\") 
pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.126640 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.126689 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.135876 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.137079 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.142652 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.160713 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.201923 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.236107 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.241777 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.244889 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.246794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.246918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247073 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247101 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g98tl\" (UniqueName: \"kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl\") 
pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.247251 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzrv\" (UniqueName: \"kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.260568 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.265963 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc 
kubenswrapper[4776]: I0128 07:08:32.269020 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.269095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.275635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.278849 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.305815 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.333936 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.334227 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f4d58bd76-2fnlt" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-api" 
containerID="cri-o://45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2" gracePeriod=30 Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.336338 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f4d58bd76-2fnlt" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-httpd" containerID="cri-o://5a3df3dd7e59af0e67d5ea3d98b2294db6ff657761526999d623884d0cbb6796" gracePeriod=30 Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.337330 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzrv\" (UniqueName: \"kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv\") pod \"watcher-api-0\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.354900 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7967c58c5f-jkrnc"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.356676 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g98tl\" (UniqueName: \"kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370668 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-logs\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370743 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86mt\" (UniqueName: \"kubernetes.io/projected/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-kube-api-access-c86mt\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 
crc kubenswrapper[4776]: I0128 07:08:32.370816 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.370888 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-config-data\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.371130 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.378447 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.384104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.386109 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.441610 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7967c58c5f-jkrnc"] Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.442305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g98tl\" (UniqueName: \"kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl\") pod \"watcher-decision-engine-0\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-httpd-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 
07:08:32.475734 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-logs\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86mt\" (UniqueName: \"kubernetes.io/projected/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-kube-api-access-c86mt\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5fz\" (UniqueName: \"kubernetes.io/projected/502a66df-cc30-46b3-98d4-d056d3497547-kube-api-access-fv5fz\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475859 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-ovndb-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475908 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475934 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-config-data\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475955 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-public-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-combined-ca-bundle\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.475998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-internal-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.476396 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-logs\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.489192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.494388 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.514415 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86mt\" (UniqueName: \"kubernetes.io/projected/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-kube-api-access-c86mt\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.515684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bb7f08-c9fa-4595-9b5e-b80ff3821169-config-data\") pod \"watcher-applier-0\" (UID: \"e0bb7f08-c9fa-4595-9b5e-b80ff3821169\") " pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.520533 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577409 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-ovndb-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-public-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-combined-ca-bundle\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577530 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-internal-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577617 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-httpd-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc 
kubenswrapper[4776]: I0128 07:08:32.577672 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.577695 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5fz\" (UniqueName: \"kubernetes.io/projected/502a66df-cc30-46b3-98d4-d056d3497547-kube-api-access-fv5fz\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.586190 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-public-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.587878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-httpd-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.587322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-combined-ca-bundle\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.592245 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-ovndb-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.608870 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.610235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-config\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.610774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/502a66df-cc30-46b3-98d4-d056d3497547-internal-tls-certs\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.613220 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.658356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5fz\" (UniqueName: \"kubernetes.io/projected/502a66df-cc30-46b3-98d4-d056d3497547-kube-api-access-fv5fz\") pod \"neutron-7967c58c5f-jkrnc\" (UID: \"502a66df-cc30-46b3-98d4-d056d3497547\") " pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.702359 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.779198 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zxp7l" event={"ID":"2c1400af-1c32-4f74-89f8-30b42dbb6c91","Type":"ContainerStarted","Data":"e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91"} Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.786325 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-979f97b77-x2lng" event={"ID":"9d7ed0f7-4d79-42b7-8f0d-805e6994e958","Type":"ContainerStarted","Data":"78454a043ab448da73f961debc05e496cb9e70405616f2760de26af235bb1c58"} Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.787350 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:08:32 crc kubenswrapper[4776]: I0128 07:08:32.823190 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zxp7l" podStartSLOduration=4.649600838 podStartE2EDuration="45.823170133s" podCreationTimestamp="2026-01-28 07:07:47 +0000 UTC" firstStartedPulling="2026-01-28 07:07:49.694458757 +0000 UTC m=+1041.110118917" lastFinishedPulling="2026-01-28 07:08:30.868028052 +0000 UTC m=+1082.283688212" observedRunningTime="2026-01-28 07:08:32.802319627 +0000 UTC m=+1084.217979787" watchObservedRunningTime="2026-01-28 07:08:32.823170133 +0000 UTC m=+1084.238830293" Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.059792 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65b6d456f6-wvlnv"] Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.264611 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.335135 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d0e1a6-28c0-4e77-8066-38169bc1d083" 
path="/var/lib/kubelet/pods/d2d0e1a6-28c0-4e77-8066-38169bc1d083/volumes" Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.399276 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.422974 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.496650 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7967c58c5f-jkrnc"] Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.806900 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerStarted","Data":"dbfc812e8fa79f252fc1fc83420dc31b45a07763edeb72e746272e47346b0043"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.810207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"2ba7ddaa-338f-46aa-9609-609740f34cb7","Type":"ContainerStarted","Data":"d49e48162f51c6b7ea8eb71889bc72686552dd9ab8bf8ace2c33c1fd5718bf12"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.812739 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerID="5a3df3dd7e59af0e67d5ea3d98b2294db6ff657761526999d623884d0cbb6796" exitCode=0 Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.812784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerDied","Data":"5a3df3dd7e59af0e67d5ea3d98b2294db6ff657761526999d623884d0cbb6796"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.819108 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65b6d456f6-wvlnv" 
event={"ID":"388a79ff-9e00-4cc1-a935-20a9b00402a8","Type":"ContainerStarted","Data":"9d89207e039f97d1fba655e15887a29a43e30c7bc0c95bfc146e91af877d2d29"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.819161 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65b6d456f6-wvlnv" event={"ID":"388a79ff-9e00-4cc1-a935-20a9b00402a8","Type":"ContainerStarted","Data":"0af64b978d926f6b37c5f0d359fd4d20e2c5889e0ae6e278d941c9c62df99fba"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.820188 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7967c58c5f-jkrnc" event={"ID":"502a66df-cc30-46b3-98d4-d056d3497547","Type":"ContainerStarted","Data":"24241cf801fcc058ec9f92de6012b8d2f4161abaf50f8bee502d2f522c8feeb7"} Jan 28 07:08:33 crc kubenswrapper[4776]: I0128 07:08:33.823745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"e0bb7f08-c9fa-4595-9b5e-b80ff3821169","Type":"ContainerStarted","Data":"3b73048963b2506b523de4e7fd57dccfe76ce6e6500f3084445ba2de092add95"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.837667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65b6d456f6-wvlnv" event={"ID":"388a79ff-9e00-4cc1-a935-20a9b00402a8","Type":"ContainerStarted","Data":"2dd1114315bb3bf9681941afc5abde739ba0cf561f9375729e253b1a3cc47961"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.837974 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.837989 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.842498 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7967c58c5f-jkrnc" 
event={"ID":"502a66df-cc30-46b3-98d4-d056d3497547","Type":"ContainerStarted","Data":"de2877e0678d709f2430422f426b25b1f62269aaa46b92dc33e166de8aeeaad1"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.842525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7967c58c5f-jkrnc" event={"ID":"502a66df-cc30-46b3-98d4-d056d3497547","Type":"ContainerStarted","Data":"35ef924378392f0a318f00e641264acde11e174ba94cd7904a45a78126b13976"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.843083 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.846943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerStarted","Data":"c6b1de7ac5d848c3dbeff5c9dcea6a3f2dc78af3d0c41b8d06be4d99eb4d3e70"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.846965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerStarted","Data":"6277e5a11cbe6bb596bb46f6e9669ea8ebcd658adf9df5f05f5e6a09a859fab7"} Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.847284 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.878388 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65b6d456f6-wvlnv" podStartSLOduration=3.878348632 podStartE2EDuration="3.878348632s" podCreationTimestamp="2026-01-28 07:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:34.866049757 +0000 UTC m=+1086.281709927" watchObservedRunningTime="2026-01-28 07:08:34.878348632 +0000 UTC m=+1086.294008792" Jan 28 07:08:34 crc 
kubenswrapper[4776]: I0128 07:08:34.893046 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.893031571 podStartE2EDuration="2.893031571s" podCreationTimestamp="2026-01-28 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:34.887026238 +0000 UTC m=+1086.302686418" watchObservedRunningTime="2026-01-28 07:08:34.893031571 +0000 UTC m=+1086.308691731" Jan 28 07:08:34 crc kubenswrapper[4776]: I0128 07:08:34.914166 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7967c58c5f-jkrnc" podStartSLOduration=2.914146225 podStartE2EDuration="2.914146225s" podCreationTimestamp="2026-01-28 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:34.904977885 +0000 UTC m=+1086.320638045" watchObservedRunningTime="2026-01-28 07:08:34.914146225 +0000 UTC m=+1086.329806385" Jan 28 07:08:35 crc kubenswrapper[4776]: I0128 07:08:35.859804 4776 generic.go:334] "Generic (PLEG): container finished" podID="6377f80e-0b32-479e-b33c-fc4d9f67b299" containerID="a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e" exitCode=0 Jan 28 07:08:35 crc kubenswrapper[4776]: I0128 07:08:35.859878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fg2nt" event={"ID":"6377f80e-0b32-479e-b33c-fc4d9f67b299","Type":"ContainerDied","Data":"a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e"} Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.887509 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"2ba7ddaa-338f-46aa-9609-609740f34cb7","Type":"ContainerStarted","Data":"95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1"} Jan 28 07:08:36 crc 
kubenswrapper[4776]: I0128 07:08:36.908510 4776 generic.go:334] "Generic (PLEG): container finished" podID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerID="45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2" exitCode=0 Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.908580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerDied","Data":"45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2"} Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.914581 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"e0bb7f08-c9fa-4595-9b5e-b80ff3821169","Type":"ContainerStarted","Data":"3f71b369d52e98c4e9a2332ccdd4507df0cc22402d00631a533ce177a31bd5f9"} Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.926088 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.664787418 podStartE2EDuration="4.926061878s" podCreationTimestamp="2026-01-28 07:08:32 +0000 UTC" firstStartedPulling="2026-01-28 07:08:33.429162257 +0000 UTC m=+1084.844822417" lastFinishedPulling="2026-01-28 07:08:35.690436717 +0000 UTC m=+1087.106096877" observedRunningTime="2026-01-28 07:08:36.908750298 +0000 UTC m=+1088.324410458" watchObservedRunningTime="2026-01-28 07:08:36.926061878 +0000 UTC m=+1088.341722038" Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.938285 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.68067432 podStartE2EDuration="4.938263419s" podCreationTimestamp="2026-01-28 07:08:32 +0000 UTC" firstStartedPulling="2026-01-28 07:08:33.442173141 +0000 UTC m=+1084.857833301" lastFinishedPulling="2026-01-28 07:08:35.69976224 +0000 UTC m=+1087.115422400" observedRunningTime="2026-01-28 07:08:36.930521429 +0000 UTC m=+1088.346181589" 
watchObservedRunningTime="2026-01-28 07:08:36.938263419 +0000 UTC m=+1088.353923569" Jan 28 07:08:36 crc kubenswrapper[4776]: I0128 07:08:36.946407 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.000461 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-755fdfc784-krn2x" podUID="dc39478f-fee2-4eb1-89bc-789b5179a1ca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.101508 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.226943 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.258231 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config\") pod \"8fcdde4b-0742-40b3-8f98-41b218f6476a\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.258497 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle\") pod \"8fcdde4b-0742-40b3-8f98-41b218f6476a\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.258594 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p48pz\" (UniqueName: \"kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz\") pod \"8fcdde4b-0742-40b3-8f98-41b218f6476a\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.258690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs\") pod \"8fcdde4b-0742-40b3-8f98-41b218f6476a\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.258725 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config\") pod \"8fcdde4b-0742-40b3-8f98-41b218f6476a\" (UID: \"8fcdde4b-0742-40b3-8f98-41b218f6476a\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.272616 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz" (OuterVolumeSpecName: "kube-api-access-p48pz") pod "8fcdde4b-0742-40b3-8f98-41b218f6476a" (UID: "8fcdde4b-0742-40b3-8f98-41b218f6476a"). InnerVolumeSpecName "kube-api-access-p48pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.272777 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8fcdde4b-0742-40b3-8f98-41b218f6476a" (UID: "8fcdde4b-0742-40b3-8f98-41b218f6476a"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.350685 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fcdde4b-0742-40b3-8f98-41b218f6476a" (UID: "8fcdde4b-0742-40b3-8f98-41b218f6476a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.350929 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8fcdde4b-0742-40b3-8f98-41b218f6476a" (UID: "8fcdde4b-0742-40b3-8f98-41b218f6476a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.364862 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.364892 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p48pz\" (UniqueName: \"kubernetes.io/projected/8fcdde4b-0742-40b3-8f98-41b218f6476a-kube-api-access-p48pz\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.364903 4776 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.364912 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-httpd-config\") on node 
\"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.388755 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config" (OuterVolumeSpecName: "config") pod "8fcdde4b-0742-40b3-8f98-41b218f6476a" (UID: "8fcdde4b-0742-40b3-8f98-41b218f6476a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.390302 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.469140 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jnkb\" (UniqueName: \"kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb\") pod \"6377f80e-0b32-479e-b33c-fc4d9f67b299\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.469367 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle\") pod \"6377f80e-0b32-479e-b33c-fc4d9f67b299\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.469514 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data\") pod \"6377f80e-0b32-479e-b33c-fc4d9f67b299\" (UID: \"6377f80e-0b32-479e-b33c-fc4d9f67b299\") " Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.469893 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fcdde4b-0742-40b3-8f98-41b218f6476a-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: 
I0128 07:08:37.477688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6377f80e-0b32-479e-b33c-fc4d9f67b299" (UID: "6377f80e-0b32-479e-b33c-fc4d9f67b299"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.478739 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb" (OuterVolumeSpecName: "kube-api-access-4jnkb") pod "6377f80e-0b32-479e-b33c-fc4d9f67b299" (UID: "6377f80e-0b32-479e-b33c-fc4d9f67b299"). InnerVolumeSpecName "kube-api-access-4jnkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.495966 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.502862 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6377f80e-0b32-479e-b33c-fc4d9f67b299" (UID: "6377f80e-0b32-479e-b33c-fc4d9f67b299"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.571507 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.571749 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jnkb\" (UniqueName: \"kubernetes.io/projected/6377f80e-0b32-479e-b33c-fc4d9f67b299-kube-api-access-4jnkb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.571761 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6377f80e-0b32-479e-b33c-fc4d9f67b299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.615855 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.940291 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4d58bd76-2fnlt" event={"ID":"8fcdde4b-0742-40b3-8f98-41b218f6476a","Type":"ContainerDied","Data":"e5c59fca3881f44d47d647801b6d60dc20440f97bab558f11dfac93f5c5e1684"} Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.940370 4776 scope.go:117] "RemoveContainer" containerID="5a3df3dd7e59af0e67d5ea3d98b2294db6ff657761526999d623884d0cbb6796" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.940566 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f4d58bd76-2fnlt" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.955006 4776 generic.go:334] "Generic (PLEG): container finished" podID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" containerID="e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91" exitCode=0 Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.955096 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zxp7l" event={"ID":"2c1400af-1c32-4f74-89f8-30b42dbb6c91","Type":"ContainerDied","Data":"e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91"} Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.962679 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fg2nt" Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.964622 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fg2nt" event={"ID":"6377f80e-0b32-479e-b33c-fc4d9f67b299","Type":"ContainerDied","Data":"c5e09565803f99d513ec3b1bec6643c2ed0b2895e20fa36714ce5b30ddf905ca"} Jan 28 07:08:37 crc kubenswrapper[4776]: I0128 07:08:37.964683 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e09565803f99d513ec3b1bec6643c2ed0b2895e20fa36714ce5b30ddf905ca" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.054597 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.066674 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f4d58bd76-2fnlt"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.169350 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5cc6b97df7-r6gfn"] Jan 28 07:08:38 crc kubenswrapper[4776]: E0128 07:08:38.169821 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" 
containerName="neutron-httpd" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.169840 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-httpd" Jan 28 07:08:38 crc kubenswrapper[4776]: E0128 07:08:38.169854 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" containerName="barbican-db-sync" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.169861 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" containerName="barbican-db-sync" Jan 28 07:08:38 crc kubenswrapper[4776]: E0128 07:08:38.169873 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-api" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.169879 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-api" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.170057 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" containerName="barbican-db-sync" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.170084 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-api" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.170098 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" containerName="neutron-httpd" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.171079 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.185125 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.185375 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44k68" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.185519 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.187365 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.187452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data-custom\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.187507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-combined-ca-bundle\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.187560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vs2mx\" (UniqueName: \"kubernetes.io/projected/87270d72-c59e-4526-b69b-ceaebfb13fdd-kube-api-access-vs2mx\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.187612 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87270d72-c59e-4526-b69b-ceaebfb13fdd-logs\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.225035 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5cc6b97df7-r6gfn"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.234110 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f446b9874-8rzlp"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.235563 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.241936 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.253794 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f446b9874-8rzlp"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.288970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.289043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data-custom\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.289094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-combined-ca-bundle\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.289132 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs2mx\" (UniqueName: \"kubernetes.io/projected/87270d72-c59e-4526-b69b-ceaebfb13fdd-kube-api-access-vs2mx\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") 
" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.289188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87270d72-c59e-4526-b69b-ceaebfb13fdd-logs\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.290023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87270d72-c59e-4526-b69b-ceaebfb13fdd-logs\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.302118 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-combined-ca-bundle\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.306332 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data-custom\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.310366 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87270d72-c59e-4526-b69b-ceaebfb13fdd-config-data\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc 
kubenswrapper[4776]: I0128 07:08:38.310794 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.312393 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.314256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs2mx\" (UniqueName: \"kubernetes.io/projected/87270d72-c59e-4526-b69b-ceaebfb13fdd-kube-api-access-vs2mx\") pod \"barbican-worker-5cc6b97df7-r6gfn\" (UID: \"87270d72-c59e-4526-b69b-ceaebfb13fdd\") " pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.330124 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.391747 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-combined-ca-bundle\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.392055 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data-custom\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.392097 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.392155 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5mtw\" (UniqueName: \"kubernetes.io/projected/3092241e-a9e3-4c51-b31b-36eae29a52e1-kube-api-access-b5mtw\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.392199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092241e-a9e3-4c51-b31b-36eae29a52e1-logs\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.410822 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.412302 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.419938 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.425476 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.494817 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data-custom\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.494874 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.494908 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.494959 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4dv\" (UniqueName: \"kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: 
\"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.494990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5mtw\" (UniqueName: \"kubernetes.io/projected/3092241e-a9e3-4c51-b31b-36eae29a52e1-kube-api-access-b5mtw\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092241e-a9e3-4c51-b31b-36eae29a52e1-logs\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495082 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495162 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-combined-ca-bundle\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.495181 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.496197 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092241e-a9e3-4c51-b31b-36eae29a52e1-logs\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.502500 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data-custom\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.509310 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-combined-ca-bundle\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.510001 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3092241e-a9e3-4c51-b31b-36eae29a52e1-config-data\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.517659 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5mtw\" (UniqueName: \"kubernetes.io/projected/3092241e-a9e3-4c51-b31b-36eae29a52e1-kube-api-access-b5mtw\") pod \"barbican-keystone-listener-6f446b9874-8rzlp\" (UID: \"3092241e-a9e3-4c51-b31b-36eae29a52e1\") " pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.526073 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597162 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597199 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z467\" (UniqueName: \"kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597265 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: 
\"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597342 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597361 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4dv\" (UniqueName: \"kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597410 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: 
\"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.597442 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.598008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.598038 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.598758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.599101 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 
crc kubenswrapper[4776]: I0128 07:08:38.599494 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.601416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.624269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4dv\" (UniqueName: \"kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv\") pod \"dnsmasq-dns-85ff748b95-tmwn7\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.696478 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.698422 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z467\" (UniqueName: \"kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.698833 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.698945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.699034 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.699253 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " 
pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.703325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.705764 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.717756 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.722199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z467\" (UniqueName: \"kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: I0128 07:08:38.724078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom\") pod \"barbican-api-5f8bdb5bcd-hfpbh\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:38 crc kubenswrapper[4776]: 
I0128 07:08:38.753733 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:39 crc kubenswrapper[4776]: I0128 07:08:39.330676 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fcdde4b-0742-40b3-8f98-41b218f6476a" path="/var/lib/kubelet/pods/8fcdde4b-0742-40b3-8f98-41b218f6476a/volumes" Jan 28 07:08:40 crc kubenswrapper[4776]: I0128 07:08:40.866920 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55df755858-s7sbs"] Jan 28 07:08:40 crc kubenswrapper[4776]: I0128 07:08:40.871697 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:40 crc kubenswrapper[4776]: I0128 07:08:40.876276 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 28 07:08:40 crc kubenswrapper[4776]: I0128 07:08:40.876510 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 28 07:08:40 crc kubenswrapper[4776]: I0128 07:08:40.888328 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55df755858-s7sbs"] Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055359 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52586b79-6cf4-475f-852d-aa3c903b5b38-logs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-internal-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: 
\"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055695 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055766 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data-custom\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055877 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-public-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.055933 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-combined-ca-bundle\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.056098 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56k5\" (UniqueName: 
\"kubernetes.io/projected/52586b79-6cf4-475f-852d-aa3c903b5b38-kube-api-access-m56k5\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.162085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52586b79-6cf4-475f-852d-aa3c903b5b38-logs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.162149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-internal-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.162343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.162571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52586b79-6cf4-475f-852d-aa3c903b5b38-logs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.162389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data-custom\") pod 
\"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.163303 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-public-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.163341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-combined-ca-bundle\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.163531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56k5\" (UniqueName: \"kubernetes.io/projected/52586b79-6cf4-475f-852d-aa3c903b5b38-kube-api-access-m56k5\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.170445 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.178864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-internal-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: 
\"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.180070 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56k5\" (UniqueName: \"kubernetes.io/projected/52586b79-6cf4-475f-852d-aa3c903b5b38-kube-api-access-m56k5\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.181130 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-config-data-custom\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.182383 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-public-tls-certs\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.185949 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52586b79-6cf4-475f-852d-aa3c903b5b38-combined-ca-bundle\") pod \"barbican-api-55df755858-s7sbs\" (UID: \"52586b79-6cf4-475f-852d-aa3c903b5b38\") " pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:41 crc kubenswrapper[4776]: I0128 07:08:41.239288 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.495599 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.501748 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.521286 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.552320 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.615771 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Jan 28 07:08:42 crc kubenswrapper[4776]: I0128 07:08:42.639323 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.057070 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.060836 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.095758 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.101269 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.834627 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921697 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921746 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4frmg\" (UniqueName: \"kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.921790 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data\") pod \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\" (UID: \"2c1400af-1c32-4f74-89f8-30b42dbb6c91\") " Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.922845 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.927701 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.928870 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg" (OuterVolumeSpecName: "kube-api-access-4frmg") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "kube-api-access-4frmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.930338 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts" (OuterVolumeSpecName: "scripts") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.966120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:43 crc kubenswrapper[4776]: I0128 07:08:43.992878 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data" (OuterVolumeSpecName: "config-data") pod "2c1400af-1c32-4f74-89f8-30b42dbb6c91" (UID: "2c1400af-1c32-4f74-89f8-30b42dbb6c91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024593 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024629 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4frmg\" (UniqueName: \"kubernetes.io/projected/2c1400af-1c32-4f74-89f8-30b42dbb6c91-kube-api-access-4frmg\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024643 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024656 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1400af-1c32-4f74-89f8-30b42dbb6c91-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024668 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.024678 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1400af-1c32-4f74-89f8-30b42dbb6c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.069052 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zxp7l" event={"ID":"2c1400af-1c32-4f74-89f8-30b42dbb6c91","Type":"ContainerDied","Data":"948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1"} Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.069109 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.069743 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zxp7l" Jan 28 07:08:44 crc kubenswrapper[4776]: I0128 07:08:44.413674 4776 scope.go:117] "RemoveContainer" containerID="45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.006669 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5cc6b97df7-r6gfn"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.094120 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:08:45 crc kubenswrapper[4776]: E0128 07:08:45.094591 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" containerName="cinder-db-sync" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.094605 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" containerName="cinder-db-sync" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.094780 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" containerName="cinder-db-sync" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.095773 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.101141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.101488 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" event={"ID":"87270d72-c59e-4526-b69b-ceaebfb13fdd","Type":"ContainerStarted","Data":"5656d4eb3127711817b7fe9ba13947ad606843bb9b4a77b7abaf4a0b45bef8bd"} Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.102042 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.102256 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2b87g" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.102422 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.115989 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55df755858-s7sbs"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.134355 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerStarted","Data":"98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf"} Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143658 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-central-agent" 
containerID="cri-o://4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21" gracePeriod=30 Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143714 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143736 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="proxy-httpd" containerID="cri-o://98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf" gracePeriod=30 Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143779 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="sg-core" containerID="cri-o://d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c" gracePeriod=30 Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.143809 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-notification-agent" containerID="cri-o://43accac2de5072edeb03536a049dce90caf20f7eff3b337ec8fbdc4c6b66f285" gracePeriod=30 Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153689 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phhn4\" (UniqueName: \"kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153734 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.153839 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.170830 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.195905 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:08:45 crc 
kubenswrapper[4776]: I0128 07:08:45.197539 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.235785 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256872 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256908 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.256930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257009 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257153 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pd4lw\" (UniqueName: \"kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257181 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phhn4\" (UniqueName: \"kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257216 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.257999 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.262569 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.465509736 podStartE2EDuration="57.262528198s" podCreationTimestamp="2026-01-28 07:07:48 +0000 UTC" firstStartedPulling="2026-01-28 07:07:49.734202617 +0000 UTC m=+1041.149862777" lastFinishedPulling="2026-01-28 07:08:44.531221069 +0000 UTC m=+1095.946881239" observedRunningTime="2026-01-28 07:08:45.18607313 +0000 UTC m=+1096.601733290" watchObservedRunningTime="2026-01-28 07:08:45.262528198 +0000 UTC m=+1096.678188358" Jan 
28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.279960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.283999 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.284774 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.301094 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phhn4\" (UniqueName: \"kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.301539 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.301601 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:45 crc 
kubenswrapper[4776]: I0128 07:08:45.360408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.360693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.360777 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4lw\" (UniqueName: \"kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.360813 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.360844 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.360877 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.361667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.362256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.362758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.363531 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.363822 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.401168 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4lw\" (UniqueName: \"kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw\") pod \"dnsmasq-dns-5c9776ccc5-jklmf\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.415424 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f446b9874-8rzlp"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.415458 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.415473 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.416798 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.422567 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.427726 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkxq\" (UniqueName: \"kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466407 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466468 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.466616 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.490046 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.567903 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkxq\" (UniqueName: \"kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.567938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.567965 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.568008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.568083 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.568110 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.568131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.573530 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.573906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.574142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.574251 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.576671 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.576720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.595352 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkxq\" (UniqueName: \"kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq\") pod \"cinder-api-0\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " pod="openstack/cinder-api-0" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.660061 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:45 crc kubenswrapper[4776]: I0128 07:08:45.772017 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.157264 4776 generic.go:334] "Generic (PLEG): container finished" podID="4fca1f5d-ea85-428d-ade1-b42d48e9718c" containerID="33d1874bee99c761d4e48d0fcd13066c3a5eb76af2d64f1f747df1de535afaa0" exitCode=0 Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.157839 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" event={"ID":"4fca1f5d-ea85-428d-ade1-b42d48e9718c","Type":"ContainerDied","Data":"33d1874bee99c761d4e48d0fcd13066c3a5eb76af2d64f1f747df1de535afaa0"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.157866 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" event={"ID":"4fca1f5d-ea85-428d-ade1-b42d48e9718c","Type":"ContainerStarted","Data":"936260a91d7127c300529e3a27d9dd56283c9705b99a11822bbd79b031d32b25"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162111 4776 generic.go:334] "Generic (PLEG): container finished" podID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerID="98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf" exitCode=0 Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162134 4776 generic.go:334] "Generic (PLEG): container finished" podID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerID="d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c" exitCode=2 Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162141 4776 generic.go:334] "Generic (PLEG): container finished" podID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerID="4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21" exitCode=0 Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162176 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerDied","Data":"98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162194 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerDied","Data":"d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.162203 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerDied","Data":"4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.164297 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" event={"ID":"3092241e-a9e3-4c51-b31b-36eae29a52e1","Type":"ContainerStarted","Data":"7b54aff0e14a7f8384c4745e716654f6995801a823578b9ecebc8d61267742cc"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.166681 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerStarted","Data":"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.166702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerStarted","Data":"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.166711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" 
event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerStarted","Data":"b42c6d9d677c037bbecfa068a9d87b2426aaa24828e777aaea0f4ff8f99400c5"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.168772 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.168914 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.171040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55df755858-s7sbs" event={"ID":"52586b79-6cf4-475f-852d-aa3c903b5b38","Type":"ContainerStarted","Data":"724af12f0797f3af2d5f3bfb163bb8dc52517e75048c5c00575c7301cf9db112"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.171070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55df755858-s7sbs" event={"ID":"52586b79-6cf4-475f-852d-aa3c903b5b38","Type":"ContainerStarted","Data":"6bf3cca958f70f2c48cd56efa42a0edd984d3a2f8fde856f8af05273e940c596"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.171079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55df755858-s7sbs" event={"ID":"52586b79-6cf4-475f-852d-aa3c903b5b38","Type":"ContainerStarted","Data":"08c772a174109972f4b09be58a47415c0b0714aa1d4f36831b27f750a7be27b9"} Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.172007 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.172512 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.197139 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55df755858-s7sbs" 
podStartSLOduration=6.19711764 podStartE2EDuration="6.19711764s" podCreationTimestamp="2026-01-28 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:46.190427459 +0000 UTC m=+1097.606087629" watchObservedRunningTime="2026-01-28 07:08:46.19711764 +0000 UTC m=+1097.612777800" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.222032 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podStartSLOduration=8.222011206 podStartE2EDuration="8.222011206s" podCreationTimestamp="2026-01-28 07:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:46.211893751 +0000 UTC m=+1097.627553911" watchObservedRunningTime="2026-01-28 07:08:46.222011206 +0000 UTC m=+1097.637671366" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.308506 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.403487 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.524052 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.664731 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.754540 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4dv\" 
(UniqueName: \"kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv\") pod \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\" (UID: \"4fca1f5d-ea85-428d-ade1-b42d48e9718c\") " Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.763413 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv" (OuterVolumeSpecName: "kube-api-access-mm4dv") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "kube-api-access-mm4dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.781477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.783708 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.786499 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.788245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.796120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config" (OuterVolumeSpecName: "config") pod "4fca1f5d-ea85-428d-ade1-b42d48e9718c" (UID: "4fca1f5d-ea85-428d-ade1-b42d48e9718c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856767 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4dv\" (UniqueName: \"kubernetes.io/projected/4fca1f5d-ea85-428d-ade1-b42d48e9718c-kube-api-access-mm4dv\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856798 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856808 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856817 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856825 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:46 crc kubenswrapper[4776]: I0128 07:08:46.856834 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fca1f5d-ea85-428d-ade1-b42d48e9718c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.184877 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" event={"ID":"4fca1f5d-ea85-428d-ade1-b42d48e9718c","Type":"ContainerDied","Data":"936260a91d7127c300529e3a27d9dd56283c9705b99a11822bbd79b031d32b25"} Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.184908 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tmwn7" Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.184927 4776 scope.go:117] "RemoveContainer" containerID="33d1874bee99c761d4e48d0fcd13066c3a5eb76af2d64f1f747df1de535afaa0" Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.186443 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerStarted","Data":"febf7455145b4187d50e5224ae3853678d95942c87bac98ff2d2e76d5b1f437c"} Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.192726 4776 generic.go:334] "Generic (PLEG): container finished" podID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerID="00a7e038847214ef057a4d05eed478b4719250f7075da6593871df61060a45d3" exitCode=0 Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.192818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" 
event={"ID":"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd","Type":"ContainerDied","Data":"00a7e038847214ef057a4d05eed478b4719250f7075da6593871df61060a45d3"} Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.192871 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" event={"ID":"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd","Type":"ContainerStarted","Data":"d696e3e8455feef20c6c6b1208786b41f2289d81cff0f9a09b550f0c0cf36998"} Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.211796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerStarted","Data":"e9513817e7d010d87216954fe669fc4175b587b369c25b1bd087c8e4a6274158"} Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.266241 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.278204 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tmwn7"] Jan 28 07:08:47 crc kubenswrapper[4776]: I0128 07:08:47.318290 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fca1f5d-ea85-428d-ade1-b42d48e9718c" path="/var/lib/kubelet/pods/4fca1f5d-ea85-428d-ade1-b42d48e9718c/volumes" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.222507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerStarted","Data":"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab"} Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.236367 4776 generic.go:334] "Generic (PLEG): container finished" podID="2c32e828-bea4-4a05-9492-31124e2964e1" containerID="8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e" exitCode=137 Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.236405 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerDied","Data":"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"} Jan 28 07:08:48 crc kubenswrapper[4776]: W0128 07:08:48.287303 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fca1f5d_ea85_428d_ade1_b42d48e9718c.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fca1f5d_ea85_428d_ade1_b42d48e9718c.slice: no such file or directory Jan 28 07:08:48 crc kubenswrapper[4776]: W0128 07:08:48.303166 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-conmon-98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-conmon-98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf.scope: no such file or directory Jan 28 07:08:48 crc kubenswrapper[4776]: W0128 07:08:48.303210 4776 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf.scope: no such file or directory Jan 28 07:08:48 crc kubenswrapper[4776]: E0128 07:08:48.706108 4776 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fcdde4b_0742_40b3_8f98_41b218f6476a.slice/crio-conmon-45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fcdde4b_0742_40b3_8f98_41b218f6476a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6377f80e_0b32_479e_b33c_fc4d9f67b299.slice/crio-a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1400af_1c32_4f74_89f8_30b42dbb6c91.slice/crio-e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6377f80e_0b32_479e_b33c_fc4d9f67b299.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-conmon-d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6377f80e_0b32_479e_b33c_fc4d9f67b299.slice/crio-conmon-a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1400af_1c32_4f74_89f8_30b42dbb6c91.slice/crio-948618003b95dbfc8368e471ce399b796f03578a5347653e9228ef78f02d2ce1\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1400af_1c32_4f74_89f8_30b42dbb6c91.slice/crio-conmon-e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fcdde4b_0742_40b3_8f98_41b218f6476a.slice/crio-e5c59fca3881f44d47d647801b6d60dc20440f97bab558f11dfac93f5c5e1684\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fcdde4b_0742_40b3_8f98_41b218f6476a.slice/crio-45f1332b80b2ead5b8f1bd268273ede6bc21d2179b011b1bc40478a84c9a0db2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c32e828_bea4_4a05_9492_31124e2964e1.slice/crio-conmon-6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-conmon-4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6377f80e_0b32_479e_b33c_fc4d9f67b299.slice/crio-c5e09565803f99d513ec3b1bec6643c2ed0b2895e20fa36714ce5b30ddf905ca\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801e91e9_f527_41e3_9468_9ec9e9ec8f3c.slice/crio-d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c.scope\": RecentStats: unable to find data in memory cache]" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.768995 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f66fd5db5-sd687" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.941365 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.941460 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.941600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.941659 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtst6\" (UniqueName: \"kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.941732 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") " Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.942436 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs" (OuterVolumeSpecName: "logs") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.942835 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c32e828-bea4-4a05-9492-31124e2964e1-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.951675 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6" (OuterVolumeSpecName: "kube-api-access-mtst6") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1"). InnerVolumeSpecName "kube-api-access-mtst6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:48 crc kubenswrapper[4776]: I0128 07:08:48.953875 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:49 crc kubenswrapper[4776]: E0128 07:08:49.006745 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts podName:2c32e828-bea4-4a05-9492-31124e2964e1 nodeName:}" failed. No retries permitted until 2026-01-28 07:08:49.506527721 +0000 UTC m=+1100.922187881 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1") : error deleting /var/lib/kubelet/pods/2c32e828-bea4-4a05-9492-31124e2964e1/volume-subpaths: remove /var/lib/kubelet/pods/2c32e828-bea4-4a05-9492-31124e2964e1/volume-subpaths: no such file or directory
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.007487 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data" (OuterVolumeSpecName: "config-data") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.044883 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c32e828-bea4-4a05-9492-31124e2964e1-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.045309 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.045323 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtst6\" (UniqueName: \"kubernetes.io/projected/2c32e828-bea4-4a05-9492-31124e2964e1-kube-api-access-mtst6\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.047750 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c7f79f5b8-2xn7l"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.251541 4776 generic.go:334] "Generic (PLEG): container finished" podID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerID="43accac2de5072edeb03536a049dce90caf20f7eff3b337ec8fbdc4c6b66f285" exitCode=0
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.251694 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerDied","Data":"43accac2de5072edeb03536a049dce90caf20f7eff3b337ec8fbdc4c6b66f285"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.256008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" event={"ID":"3092241e-a9e3-4c51-b31b-36eae29a52e1","Type":"ContainerStarted","Data":"35b686d1ad45c4a10334ef53666971e30e8f228e4dd9ed8a6e4a57b8d02487f6"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.256040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" event={"ID":"3092241e-a9e3-4c51-b31b-36eae29a52e1","Type":"ContainerStarted","Data":"8370a9b0e80c00c00a40d2a0302f69bb55f6b7fcd3ef80d694efafef5473d181"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.260617 4776 generic.go:334] "Generic (PLEG): container finished" podID="2c32e828-bea4-4a05-9492-31124e2964e1" containerID="6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa" exitCode=137
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.260666 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerDied","Data":"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.260685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f66fd5db5-sd687" event={"ID":"2c32e828-bea4-4a05-9492-31124e2964e1","Type":"ContainerDied","Data":"2e2e2fc23192afb8d74afd5af165273c5e62e2fa3abf365ed5a96a89edb65c39"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.260701 4776 scope.go:117] "RemoveContainer" containerID="6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.260794 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f66fd5db5-sd687"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.270687 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerStarted","Data":"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.270983 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.285091 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f446b9874-8rzlp" podStartSLOduration=8.587491146 podStartE2EDuration="11.285076799s" podCreationTimestamp="2026-01-28 07:08:38 +0000 UTC" firstStartedPulling="2026-01-28 07:08:45.268352156 +0000 UTC m=+1096.684012316" lastFinishedPulling="2026-01-28 07:08:47.965937809 +0000 UTC m=+1099.381597969" observedRunningTime="2026-01-28 07:08:49.282828828 +0000 UTC m=+1100.698488988" watchObservedRunningTime="2026-01-28 07:08:49.285076799 +0000 UTC m=+1100.700736959"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.292354 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" event={"ID":"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd","Type":"ContainerStarted","Data":"94e09a9c1472e2917b8bbddf28d644810bebee8c18193b3624497336fae60209"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.292710 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.303704 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerStarted","Data":"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.363977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" event={"ID":"87270d72-c59e-4526-b69b-ceaebfb13fdd","Type":"ContainerStarted","Data":"25c75c5d0e27fa8de5c71baceb4e43dad52dde26145f1eb15f09fb3252836acd"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.364040 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-659688b465-m49kr"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.364050 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" event={"ID":"87270d72-c59e-4526-b69b-ceaebfb13fdd","Type":"ContainerStarted","Data":"7962e09d392a9d9107ac9410628485c76e4f13511a46c05d0dbd1c157386e7ea"}
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.400826 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.400804504 podStartE2EDuration="4.400804504s" podCreationTimestamp="2026-01-28 07:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:49.336487446 +0000 UTC m=+1100.752147606" watchObservedRunningTime="2026-01-28 07:08:49.400804504 +0000 UTC m=+1100.816464664"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.429595 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.442735 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" podStartSLOduration=4.442710972 podStartE2EDuration="4.442710972s" podCreationTimestamp="2026-01-28 07:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:49.420070327 +0000 UTC m=+1100.835730487" watchObservedRunningTime="2026-01-28 07:08:49.442710972 +0000 UTC m=+1100.858371132"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.468512 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5cc6b97df7-r6gfn" podStartSLOduration=8.526659394 podStartE2EDuration="11.468496673s" podCreationTimestamp="2026-01-28 07:08:38 +0000 UTC" firstStartedPulling="2026-01-28 07:08:45.027462071 +0000 UTC m=+1096.443122231" lastFinishedPulling="2026-01-28 07:08:47.96929935 +0000 UTC m=+1099.384959510" observedRunningTime="2026-01-28 07:08:49.460223978 +0000 UTC m=+1100.875884138" watchObservedRunningTime="2026-01-28 07:08:49.468496673 +0000 UTC m=+1100.884156823"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.557382 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") pod \"2c32e828-bea4-4a05-9492-31124e2964e1\" (UID: \"2c32e828-bea4-4a05-9492-31124e2964e1\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.558462 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts" (OuterVolumeSpecName: "scripts") pod "2c32e828-bea4-4a05-9492-31124e2964e1" (UID: "2c32e828-bea4-4a05-9492-31124e2964e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.567838 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-755fdfc784-krn2x"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.600751 4776 scope.go:117] "RemoveContainer" containerID="8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.617646 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f66fd5db5-sd687"]
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.628650 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f66fd5db5-sd687"]
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.661611 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c32e828-bea4-4a05-9492-31124e2964e1-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.695259 4776 scope.go:117] "RemoveContainer" containerID="6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"
Jan 28 07:08:49 crc kubenswrapper[4776]: E0128 07:08:49.697892 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa\": container with ID starting with 6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa not found: ID does not exist" containerID="6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.697925 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa"} err="failed to get container status \"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa\": rpc error: code = NotFound desc = could not find container \"6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa\": container with ID starting with 6b34d9dafd9eee6f723fa4716e1157b53d5ae185f505afa8251c3edbe9e923aa not found: ID does not exist"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.697948 4776 scope.go:117] "RemoveContainer" containerID="8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"
Jan 28 07:08:49 crc kubenswrapper[4776]: E0128 07:08:49.699174 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e\": container with ID starting with 8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e not found: ID does not exist" containerID="8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.699209 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e"} err="failed to get container status \"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e\": rpc error: code = NotFound desc = could not find container \"8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e\": container with ID starting with 8931943c83fbf0139c973a9c37200ccae6895d638d5e1560eb0be4b1bb1fda3e not found: ID does not exist"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.768216 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868517 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868702 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868732 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868771 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868813 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p4kg\" (UniqueName: \"kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868863 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.868922 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts\") pod \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\" (UID: \"801e91e9-f527-41e3-9468-9ec9e9ec8f3c\") "
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.870868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.875321 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts" (OuterVolumeSpecName: "scripts") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.875359 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.875608 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg" (OuterVolumeSpecName: "kube-api-access-6p4kg") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "kube-api-access-6p4kg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.952632 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.971146 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.971179 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.971188 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p4kg\" (UniqueName: \"kubernetes.io/projected/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-kube-api-access-6p4kg\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.971199 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:49 crc kubenswrapper[4776]: I0128 07:08:49.971208 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.037483 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.072781 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.078675 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data" (OuterVolumeSpecName: "config-data") pod "801e91e9-f527-41e3-9468-9ec9e9ec8f3c" (UID: "801e91e9-f527-41e3-9468-9ec9e9ec8f3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.082741 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.082957 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api-log" containerID="cri-o://6277e5a11cbe6bb596bb46f6e9669ea8ebcd658adf9df5f05f5e6a09a859fab7" gracePeriod=30
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.083090 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api" containerID="cri-o://c6b1de7ac5d848c3dbeff5c9dcea6a3f2dc78af3d0c41b8d06be4d99eb4d3e70" gracePeriod=30
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.174835 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801e91e9-f527-41e3-9468-9ec9e9ec8f3c-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.355276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerStarted","Data":"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df"}
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.366036 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801e91e9-f527-41e3-9468-9ec9e9ec8f3c","Type":"ContainerDied","Data":"82dbdba568082e1d4c8a50ab75ff9b3e5d2a65251163f2e7c2e32306fb162f75"}
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.366094 4776 scope.go:117] "RemoveContainer" containerID="98313f27f6ba916df3d40025fdc0eabe636c7163a08080039d8d250f38bf89cf"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.366206 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.372731 4776 generic.go:334] "Generic (PLEG): container finished" podID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerID="6277e5a11cbe6bb596bb46f6e9669ea8ebcd658adf9df5f05f5e6a09a859fab7" exitCode=143
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.372792 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerDied","Data":"6277e5a11cbe6bb596bb46f6e9669ea8ebcd658adf9df5f05f5e6a09a859fab7"}
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.410997 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7876108630000003 podStartE2EDuration="5.41097609s" podCreationTimestamp="2026-01-28 07:08:45 +0000 UTC" firstStartedPulling="2026-01-28 07:08:46.374196621 +0000 UTC m=+1097.789856781" lastFinishedPulling="2026-01-28 07:08:47.997561848 +0000 UTC m=+1099.413222008" observedRunningTime="2026-01-28 07:08:50.390447814 +0000 UTC m=+1101.806107974" watchObservedRunningTime="2026-01-28 07:08:50.41097609 +0000 UTC m=+1101.826636250"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.415791 4776 scope.go:117] "RemoveContainer" containerID="d6233f5f0aa104656ea27391cc344e2986118f551562125632b837160e46818c"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.446709 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.447705 4776 scope.go:117] "RemoveContainer" containerID="43accac2de5072edeb03536a049dce90caf20f7eff3b337ec8fbdc4c6b66f285"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.462706 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.486500 4776 scope.go:117] "RemoveContainer" containerID="4426b54f84a837fd86eaba7c242d64b2a2c3ca6628dd96efd9d15a39607dad21"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.487418 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488326 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488346 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488379 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="proxy-httpd"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488386 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="proxy-httpd"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488398 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon-log"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488403 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon-log"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488416 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-notification-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488425 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-notification-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488442 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-central-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488451 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-central-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488465 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="sg-core"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488473 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="sg-core"
Jan 28 07:08:50 crc kubenswrapper[4776]: E0128 07:08:50.488486 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca1f5d-ea85-428d-ade1-b42d48e9718c" containerName="init"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488491 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca1f5d-ea85-428d-ade1-b42d48e9718c" containerName="init"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488732 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="proxy-httpd"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488756 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca1f5d-ea85-428d-ade1-b42d48e9718c" containerName="init"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488769 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488784 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-central-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488800 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="sg-core"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488817 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" containerName="ceilometer-notification-agent"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.488827 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" containerName="horizon-log"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.493849 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.493938 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.498756 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.502579 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.506888 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582636 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxqt\" (UniqueName: \"kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.582821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.583021 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685321 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685388 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685474 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685570 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.685595 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxqt\" (UniqueName: \"kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.690944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.691308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.691626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.692031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.695395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.701474 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.712471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxqt\" (UniqueName: \"kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt\") pod \"ceilometer-0\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " pod="openstack/ceilometer-0"
Jan 28 07:08:50 crc kubenswrapper[4776]: I0128 07:08:50.837398 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.316710 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c32e828-bea4-4a05-9492-31124e2964e1" path="/var/lib/kubelet/pods/2c32e828-bea4-4a05-9492-31124e2964e1/volumes"
Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.317730 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801e91e9-f527-41e3-9468-9ec9e9ec8f3c" path="/var/lib/kubelet/pods/801e91e9-f527-41e3-9468-9ec9e9ec8f3c/volumes"
Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.358658 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.391716 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerStarted","Data":"0b23a08b5d9ae645c69c77818c33a9dd6256a47795bb8354a27291357cceea98"}
Jan 28
07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.392902 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api-log" containerID="cri-o://199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" gracePeriod=30 Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.393947 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api" containerID="cri-o://0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" gracePeriod=30 Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.611839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.880960 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-755fdfc784-krn2x" Jan 28 07:08:51 crc kubenswrapper[4776]: I0128 07:08:51.954180 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.121852 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.217564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.217620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.217683 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.217711 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.217769 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.218054 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs" (OuterVolumeSpecName: "logs") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.218151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zkxq\" (UniqueName: \"kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.218177 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.218444 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle\") pod \"57d343c3-38e8-4585-9d8b-84410907752f\" (UID: \"57d343c3-38e8-4585-9d8b-84410907752f\") " Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.219243 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57d343c3-38e8-4585-9d8b-84410907752f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.219263 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/57d343c3-38e8-4585-9d8b-84410907752f-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.222438 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts" (OuterVolumeSpecName: "scripts") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.222662 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq" (OuterVolumeSpecName: "kube-api-access-8zkxq") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "kube-api-access-8zkxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.224715 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.246827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.265229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data" (OuterVolumeSpecName: "config-data") pod "57d343c3-38e8-4585-9d8b-84410907752f" (UID: "57d343c3-38e8-4585-9d8b-84410907752f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.321075 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zkxq\" (UniqueName: \"kubernetes.io/projected/57d343c3-38e8-4585-9d8b-84410907752f-kube-api-access-8zkxq\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.321104 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.321114 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.321123 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.321132 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57d343c3-38e8-4585-9d8b-84410907752f-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.405895 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerStarted","Data":"4ce6a80eba720f8d0f0e2a150c530cf0b1d47f34ce4a588b47f4dd8e4425cb3d"} Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.408733 4776 generic.go:334] "Generic (PLEG): container finished" podID="57d343c3-38e8-4585-9d8b-84410907752f" containerID="0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" exitCode=0 Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.408763 4776 generic.go:334] "Generic (PLEG): container finished" podID="57d343c3-38e8-4585-9d8b-84410907752f" containerID="199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" exitCode=143 Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.408949 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon-log" containerID="cri-o://97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3" gracePeriod=30 Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409404 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerDied","Data":"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f"} Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerDied","Data":"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab"} Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57d343c3-38e8-4585-9d8b-84410907752f","Type":"ContainerDied","Data":"febf7455145b4187d50e5224ae3853678d95942c87bac98ff2d2e76d5b1f437c"} Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409492 4776 scope.go:117] 
"RemoveContainer" containerID="0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409419 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.409511 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" containerID="cri-o://71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d" gracePeriod=30 Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.440009 4776 scope.go:117] "RemoveContainer" containerID="199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.454148 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.469514 4776 scope.go:117] "RemoveContainer" containerID="0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.469618 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:52 crc kubenswrapper[4776]: E0128 07:08:52.470092 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f\": container with ID starting with 0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f not found: ID does not exist" containerID="0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.470119 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f"} err="failed to get 
container status \"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f\": rpc error: code = NotFound desc = could not find container \"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f\": container with ID starting with 0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f not found: ID does not exist" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.470139 4776 scope.go:117] "RemoveContainer" containerID="199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" Jan 28 07:08:52 crc kubenswrapper[4776]: E0128 07:08:52.470622 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab\": container with ID starting with 199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab not found: ID does not exist" containerID="199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.470675 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab"} err="failed to get container status \"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab\": rpc error: code = NotFound desc = could not find container \"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab\": container with ID starting with 199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab not found: ID does not exist" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.470710 4776 scope.go:117] "RemoveContainer" containerID="0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.471060 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f"} 
err="failed to get container status \"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f\": rpc error: code = NotFound desc = could not find container \"0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f\": container with ID starting with 0a57b9d57ce5b863de90d1be3cac2e2190fa3c667d8c7e28065e6c02b5172d8f not found: ID does not exist" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.471087 4776 scope.go:117] "RemoveContainer" containerID="199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.471382 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab"} err="failed to get container status \"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab\": rpc error: code = NotFound desc = could not find container \"199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab\": container with ID starting with 199d6775c07028c95ed63c463ece3dcae72a0e4d56b132e668ac338a2d3e7fab not found: ID does not exist" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.476775 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:52 crc kubenswrapper[4776]: E0128 07:08:52.477148 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.477164 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api" Jan 28 07:08:52 crc kubenswrapper[4776]: E0128 07:08:52.477192 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api-log" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.477198 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api-log" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.477366 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.477391 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d343c3-38e8-4585-9d8b-84410907752f" containerName="cinder-api-log" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.478560 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.480418 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.488911 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.488977 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.491135 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.526891 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.527000 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce856766-46b4-4498-9aa0-bdf8c0e946db-logs\") pod \"cinder-api-0\" (UID: 
\"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.527029 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.527211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.528246 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce856766-46b4-4498-9aa0-bdf8c0e946db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.528301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9k9\" (UniqueName: \"kubernetes.io/projected/ce856766-46b4-4498-9aa0-bdf8c0e946db-kube-api-access-cf9k9\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.528421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-scripts\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: 
I0128 07:08:52.528451 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.529316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631157 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631284 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce856766-46b4-4498-9aa0-bdf8c0e946db-logs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631380 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631423 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce856766-46b4-4498-9aa0-bdf8c0e946db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631463 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9k9\" (UniqueName: \"kubernetes.io/projected/ce856766-46b4-4498-9aa0-bdf8c0e946db-kube-api-access-cf9k9\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-scripts\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.631529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc 
kubenswrapper[4776]: I0128 07:08:52.632024 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce856766-46b4-4498-9aa0-bdf8c0e946db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.632965 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce856766-46b4-4498-9aa0-bdf8c0e946db-logs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.637671 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.650641 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.651113 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-config-data\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.652443 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-scripts\") pod \"cinder-api-0\" (UID: 
\"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.653585 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.653776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce856766-46b4-4498-9aa0-bdf8c0e946db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.661233 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9k9\" (UniqueName: \"kubernetes.io/projected/ce856766-46b4-4498-9aa0-bdf8c0e946db-kube-api-access-cf9k9\") pod \"cinder-api-0\" (UID: \"ce856766-46b4-4498-9aa0-bdf8c0e946db\") " pod="openstack/cinder-api-0" Jan 28 07:08:52 crc kubenswrapper[4776]: I0128 07:08:52.810777 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.003488 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.149479 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55df755858-s7sbs" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.201420 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.201683 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" containerID="cri-o://e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94" gracePeriod=30 Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.202112 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" containerID="cri-o://23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8" gracePeriod=30 Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.219456 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": EOF" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.219471 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": EOF" Jan 28 07:08:53 crc kubenswrapper[4776]: 
I0128 07:08:53.250255 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.170:9322/\": read tcp 10.217.0.2:57570->10.217.0.170:9322: read: connection reset by peer" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.250278 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9322/\": read tcp 10.217.0.2:57562->10.217.0.170:9322: read: connection reset by peer" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.319389 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d343c3-38e8-4585-9d8b-84410907752f" path="/var/lib/kubelet/pods/57d343c3-38e8-4585-9d8b-84410907752f/volumes" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.353734 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.421346 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerStarted","Data":"7bcbe97d50dfb8d2eed7c5cdc47883a6e058b6fb2a3d9e3b47a75dce8b63edce"} Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.423561 4776 generic.go:334] "Generic (PLEG): container finished" podID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerID="c6b1de7ac5d848c3dbeff5c9dcea6a3f2dc78af3d0c41b8d06be4d99eb4d3e70" exitCode=0 Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.423618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerDied","Data":"c6b1de7ac5d848c3dbeff5c9dcea6a3f2dc78af3d0c41b8d06be4d99eb4d3e70"} Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 
07:08:53.427938 4776 generic.go:334] "Generic (PLEG): container finished" podID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerID="e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94" exitCode=143 Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.428002 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerDied","Data":"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94"} Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.432823 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce856766-46b4-4498-9aa0-bdf8c0e946db","Type":"ContainerStarted","Data":"2b0f8631d731a1c8c34579d4090043a7b72684aa0ac0f268d79acd3f210f9ad7"} Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.843749 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.986670 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data\") pod \"09beb1cc-598c-406f-8121-acb8ece8e21c\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.987203 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znzrv\" (UniqueName: \"kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv\") pod \"09beb1cc-598c-406f-8121-acb8ece8e21c\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.987737 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs\") pod \"09beb1cc-598c-406f-8121-acb8ece8e21c\" 
(UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.987763 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle\") pod \"09beb1cc-598c-406f-8121-acb8ece8e21c\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.987795 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca\") pod \"09beb1cc-598c-406f-8121-acb8ece8e21c\" (UID: \"09beb1cc-598c-406f-8121-acb8ece8e21c\") " Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.988410 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs" (OuterVolumeSpecName: "logs") pod "09beb1cc-598c-406f-8121-acb8ece8e21c" (UID: "09beb1cc-598c-406f-8121-acb8ece8e21c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:53 crc kubenswrapper[4776]: I0128 07:08:53.994253 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv" (OuterVolumeSpecName: "kube-api-access-znzrv") pod "09beb1cc-598c-406f-8121-acb8ece8e21c" (UID: "09beb1cc-598c-406f-8121-acb8ece8e21c"). InnerVolumeSpecName "kube-api-access-znzrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.042737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "09beb1cc-598c-406f-8121-acb8ece8e21c" (UID: "09beb1cc-598c-406f-8121-acb8ece8e21c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.053499 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09beb1cc-598c-406f-8121-acb8ece8e21c" (UID: "09beb1cc-598c-406f-8121-acb8ece8e21c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.071052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data" (OuterVolumeSpecName: "config-data") pod "09beb1cc-598c-406f-8121-acb8ece8e21c" (UID: "09beb1cc-598c-406f-8121-acb8ece8e21c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.090637 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09beb1cc-598c-406f-8121-acb8ece8e21c-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.090986 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.091067 4776 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.091125 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09beb1cc-598c-406f-8121-acb8ece8e21c-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.108706 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znzrv\" (UniqueName: \"kubernetes.io/projected/09beb1cc-598c-406f-8121-acb8ece8e21c-kube-api-access-znzrv\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.446754 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerStarted","Data":"a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b"} Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.448719 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"09beb1cc-598c-406f-8121-acb8ece8e21c","Type":"ContainerDied","Data":"dbfc812e8fa79f252fc1fc83420dc31b45a07763edeb72e746272e47346b0043"} Jan 28 
07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.448767 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.448847 4776 scope.go:117] "RemoveContainer" containerID="c6b1de7ac5d848c3dbeff5c9dcea6a3f2dc78af3d0c41b8d06be4d99eb4d3e70" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.451748 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce856766-46b4-4498-9aa0-bdf8c0e946db","Type":"ContainerStarted","Data":"7d996ae6d63735c3f4391fa6d8ee39bb34d265e3143ed15de790ce82c84afe58"} Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.503781 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.514686 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.529727 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:54 crc kubenswrapper[4776]: E0128 07:08:54.530159 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api-log" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.530174 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api-log" Jan 28 07:08:54 crc kubenswrapper[4776]: E0128 07:08:54.530188 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.530193 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.530362 4776 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.530377 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" containerName="watcher-api-log" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.532814 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.539888 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.540147 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.541344 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.543714 4776 scope.go:117] "RemoveContainer" containerID="6277e5a11cbe6bb596bb46f6e9669ea8ebcd658adf9df5f05f5e6a09a859fab7" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.544474 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.618746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-config-data\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.618829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-internal-tls-certs\") pod \"watcher-api-0\" (UID: 
\"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.619046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.619137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c06de04-7886-4696-8416-3559c16a5f7f-logs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.619243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.619300 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.619387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6cbb\" (UniqueName: \"kubernetes.io/projected/8c06de04-7886-4696-8416-3559c16a5f7f-kube-api-access-k6cbb\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc 
kubenswrapper[4776]: I0128 07:08:54.720902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.720956 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.722153 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6cbb\" (UniqueName: \"kubernetes.io/projected/8c06de04-7886-4696-8416-3559c16a5f7f-kube-api-access-k6cbb\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.722358 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-config-data\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.722452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.723371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.723471 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c06de04-7886-4696-8416-3559c16a5f7f-logs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.724565 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c06de04-7886-4696-8416-3559c16a5f7f-logs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.724702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.730047 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.730518 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.731531 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-config-data\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.733529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8c06de04-7886-4696-8416-3559c16a5f7f-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.748116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6cbb\" (UniqueName: \"kubernetes.io/projected/8c06de04-7886-4696-8416-3559c16a5f7f-kube-api-access-k6cbb\") pod \"watcher-api-0\" (UID: \"8c06de04-7886-4696-8416-3559c16a5f7f\") " pod="openstack/watcher-api-0" Jan 28 07:08:54 crc kubenswrapper[4776]: I0128 07:08:54.884988 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.319599 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09beb1cc-598c-406f-8121-acb8ece8e21c" path="/var/lib/kubelet/pods/09beb1cc-598c-406f-8121-acb8ece8e21c/volumes" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.363164 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 28 07:08:55 crc kubenswrapper[4776]: W0128 07:08:55.364411 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c06de04_7886_4696_8416_3559c16a5f7f.slice/crio-f6e546a177b2d06618df43ada045be7220fd1a25340d8636d77d1d171b5060e9 WatchSource:0}: Error finding container f6e546a177b2d06618df43ada045be7220fd1a25340d8636d77d1d171b5060e9: Status 404 returned error can't find the container with id f6e546a177b2d06618df43ada045be7220fd1a25340d8636d77d1d171b5060e9 Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.471378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce856766-46b4-4498-9aa0-bdf8c0e946db","Type":"ContainerStarted","Data":"b27e1a31f2c6fc82cdbd53eefc336e27fc9c47554c2c03e0a8de8a52b4836f4f"} Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.471737 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.479889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8c06de04-7886-4696-8416-3559c16a5f7f","Type":"ContainerStarted","Data":"f6e546a177b2d06618df43ada045be7220fd1a25340d8636d77d1d171b5060e9"} Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.507965 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.5079459010000003 podStartE2EDuration="3.507945901s" 
podCreationTimestamp="2026-01-28 07:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:55.507917081 +0000 UTC m=+1106.923577241" watchObservedRunningTime="2026-01-28 07:08:55.507945901 +0000 UTC m=+1106.923606071" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.661726 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.778810 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"] Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.779254 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="dnsmasq-dns" containerID="cri-o://daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10" gracePeriod=10 Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.783835 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 28 07:08:55 crc kubenswrapper[4776]: I0128 07:08:55.841482 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.285055 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360038 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdz85\" (UniqueName: \"kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360144 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360327 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.360375 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0\") pod \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\" (UID: \"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804\") " Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.364948 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85" (OuterVolumeSpecName: "kube-api-access-mdz85") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "kube-api-access-mdz85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.406183 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.417898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.418733 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config" (OuterVolumeSpecName: "config") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.419723 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.431853 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" (UID: "f5bc5428-35ca-44a5-8fa9-7d11ec4c6804"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463367 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463405 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463418 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463428 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-dns-svc\") on node \"crc\" 
DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463436 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdz85\" (UniqueName: \"kubernetes.io/projected/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-kube-api-access-mdz85\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.463444 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.492478 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8c06de04-7886-4696-8416-3559c16a5f7f","Type":"ContainerStarted","Data":"10eb0dbd366df56aa64da154d1b0105789e13be288a7890314968f1dd02df359"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.494155 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.494266 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8c06de04-7886-4696-8416-3559c16a5f7f","Type":"ContainerStarted","Data":"bc2dce17e3646746371fef68b648c9520310eab497d58b58dc57393bc9d01dd5"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.495303 4776 generic.go:334] "Generic (PLEG): container finished" podID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerID="daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10" exitCode=0 Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.495504 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.495539 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" event={"ID":"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804","Type":"ContainerDied","Data":"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.495809 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-th8p7" event={"ID":"f5bc5428-35ca-44a5-8fa9-7d11ec4c6804","Type":"ContainerDied","Data":"8acf1a481989aea13397267c3093efe2dd3358333ae599fef839a651142d1c9d"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.495829 4776 scope.go:117] "RemoveContainer" containerID="daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.499525 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerID="71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d" exitCode=0 Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.499681 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f79f5b8-2xn7l" event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerDied","Data":"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.504621 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerStarted","Data":"41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e"} Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.504816 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="cinder-scheduler" 
containerID="cri-o://f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a" gracePeriod=30 Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.504910 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="probe" containerID="cri-o://33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df" gracePeriod=30 Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.505177 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.526748 4776 scope.go:117] "RemoveContainer" containerID="862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.530470 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.530445416 podStartE2EDuration="2.530445416s" podCreationTimestamp="2026-01-28 07:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:08:56.513497376 +0000 UTC m=+1107.929157546" watchObservedRunningTime="2026-01-28 07:08:56.530445416 +0000 UTC m=+1107.946105586" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.547171 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.430104049 podStartE2EDuration="6.547153489s" podCreationTimestamp="2026-01-28 07:08:50 +0000 UTC" firstStartedPulling="2026-01-28 07:08:51.372814841 +0000 UTC m=+1102.788475001" lastFinishedPulling="2026-01-28 07:08:55.489864281 +0000 UTC m=+1106.905524441" observedRunningTime="2026-01-28 07:08:56.539626734 +0000 UTC m=+1107.955286904" watchObservedRunningTime="2026-01-28 07:08:56.547153489 +0000 UTC m=+1107.962813659" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 
07:08:56.561472 4776 scope.go:117] "RemoveContainer" containerID="daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10" Jan 28 07:08:56 crc kubenswrapper[4776]: E0128 07:08:56.561929 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10\": container with ID starting with daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10 not found: ID does not exist" containerID="daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.561968 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10"} err="failed to get container status \"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10\": rpc error: code = NotFound desc = could not find container \"daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10\": container with ID starting with daab897a06e28d9f56e0fcc5eaec130aa986a6718b293e9aa3748e1389645d10 not found: ID does not exist" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.561992 4776 scope.go:117] "RemoveContainer" containerID="862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6" Jan 28 07:08:56 crc kubenswrapper[4776]: E0128 07:08:56.565098 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6\": container with ID starting with 862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6 not found: ID does not exist" containerID="862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.565278 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6"} err="failed to get container status \"862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6\": rpc error: code = NotFound desc = could not find container \"862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6\": container with ID starting with 862ec626b77c87bc8620eef80543e4517a17df465156fb0f95c57df13ce63fb6 not found: ID does not exist" Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.587676 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"] Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.597760 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-th8p7"] Jan 28 07:08:56 crc kubenswrapper[4776]: I0128 07:08:56.944433 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Jan 28 07:08:57 crc kubenswrapper[4776]: I0128 07:08:57.317491 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" path="/var/lib/kubelet/pods/f5bc5428-35ca-44a5-8fa9-7d11ec4c6804/volumes" Jan 28 07:08:57 crc kubenswrapper[4776]: I0128 07:08:57.522978 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerID="33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df" exitCode=0 Jan 28 07:08:57 crc kubenswrapper[4776]: I0128 07:08:57.523747 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerDied","Data":"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df"} Jan 28 
07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.532587 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.646136 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": read tcp 10.217.0.2:47642->10.217.0.177:9311: read: connection reset by peer" Jan 28 07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.646140 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": read tcp 10.217.0.2:47650->10.217.0.177:9311: read: connection reset by peer" Jan 28 07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.755844 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": dial tcp 10.217.0.177:9311: connect: connection refused" Jan 28 07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.760791 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": dial tcp 10.217.0.177:9311: connect: connection refused" Jan 28 07:08:58 crc kubenswrapper[4776]: I0128 07:08:58.809939 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.107794 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.212275 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z467\" (UniqueName: \"kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467\") pod \"67280a26-5583-4b51-b74d-c3e8b2ea6645\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.212313 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle\") pod \"67280a26-5583-4b51-b74d-c3e8b2ea6645\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.212394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data\") pod \"67280a26-5583-4b51-b74d-c3e8b2ea6645\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.212447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs\") pod \"67280a26-5583-4b51-b74d-c3e8b2ea6645\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.212515 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom\") pod \"67280a26-5583-4b51-b74d-c3e8b2ea6645\" (UID: \"67280a26-5583-4b51-b74d-c3e8b2ea6645\") " Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.214275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs" (OuterVolumeSpecName: "logs") pod "67280a26-5583-4b51-b74d-c3e8b2ea6645" (UID: "67280a26-5583-4b51-b74d-c3e8b2ea6645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.218705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467" (OuterVolumeSpecName: "kube-api-access-8z467") pod "67280a26-5583-4b51-b74d-c3e8b2ea6645" (UID: "67280a26-5583-4b51-b74d-c3e8b2ea6645"). InnerVolumeSpecName "kube-api-access-8z467". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.219358 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "67280a26-5583-4b51-b74d-c3e8b2ea6645" (UID: "67280a26-5583-4b51-b74d-c3e8b2ea6645"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.243770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67280a26-5583-4b51-b74d-c3e8b2ea6645" (UID: "67280a26-5583-4b51-b74d-c3e8b2ea6645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.283041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data" (OuterVolumeSpecName: "config-data") pod "67280a26-5583-4b51-b74d-c3e8b2ea6645" (UID: "67280a26-5583-4b51-b74d-c3e8b2ea6645"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.314571 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z467\" (UniqueName: \"kubernetes.io/projected/67280a26-5583-4b51-b74d-c3e8b2ea6645-kube-api-access-8z467\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.314600 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.314613 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.314625 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67280a26-5583-4b51-b74d-c3e8b2ea6645-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.314637 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67280a26-5583-4b51-b74d-c3e8b2ea6645-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.546886 4776 generic.go:334] "Generic (PLEG): container finished" podID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerID="23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8" exitCode=0 Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.546953 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.546959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerDied","Data":"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8"} Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.547446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8bdb5bcd-hfpbh" event={"ID":"67280a26-5583-4b51-b74d-c3e8b2ea6645","Type":"ContainerDied","Data":"b42c6d9d677c037bbecfa068a9d87b2426aaa24828e777aaea0f4ff8f99400c5"} Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.547483 4776 scope.go:117] "RemoveContainer" containerID="23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.577823 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.578822 4776 scope.go:117] "RemoveContainer" containerID="e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.588742 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5f8bdb5bcd-hfpbh"] Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.603269 4776 scope.go:117] "RemoveContainer" containerID="23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8" Jan 28 07:08:59 crc kubenswrapper[4776]: E0128 07:08:59.604054 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8\": container with ID starting with 23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8 not found: ID does not exist" 
containerID="23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.604105 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8"} err="failed to get container status \"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8\": rpc error: code = NotFound desc = could not find container \"23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8\": container with ID starting with 23fbf1c9785b9a94be009a7433f7b499907b6ba866fb9ebe80c4bf80e8e0acc8 not found: ID does not exist" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.604137 4776 scope.go:117] "RemoveContainer" containerID="e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94" Jan 28 07:08:59 crc kubenswrapper[4776]: E0128 07:08:59.604539 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94\": container with ID starting with e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94 not found: ID does not exist" containerID="e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.604586 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94"} err="failed to get container status \"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94\": rpc error: code = NotFound desc = could not find container \"e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94\": container with ID starting with e4fbff98c2922fc9943bdbc500cd81eb6b0067debf0fa2f64e36da2689edad94 not found: ID does not exist" Jan 28 07:08:59 crc kubenswrapper[4776]: I0128 07:08:59.885229 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 28 07:09:00 crc kubenswrapper[4776]: I0128 07:09:00.980929 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-979f97b77-x2lng" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.090758 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.159887 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160043 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phhn4\" (UniqueName: \"kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160290 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160363 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.160447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts\") pod \"0c200544-1881-4bdf-9e84-78ff9ccd8712\" (UID: \"0c200544-1881-4bdf-9e84-78ff9ccd8712\") " Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.161281 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/0c200544-1881-4bdf-9e84-78ff9ccd8712-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.165472 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.166628 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4" (OuterVolumeSpecName: "kube-api-access-phhn4") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "kube-api-access-phhn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.179101 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts" (OuterVolumeSpecName: "scripts") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.224393 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.263025 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.263049 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phhn4\" (UniqueName: \"kubernetes.io/projected/0c200544-1881-4bdf-9e84-78ff9ccd8712-kube-api-access-phhn4\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.263060 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.263069 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.267672 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data" (OuterVolumeSpecName: "config-data") pod "0c200544-1881-4bdf-9e84-78ff9ccd8712" (UID: "0c200544-1881-4bdf-9e84-78ff9ccd8712"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.315210 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" path="/var/lib/kubelet/pods/67280a26-5583-4b51-b74d-c3e8b2ea6645/volumes" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.364346 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c200544-1881-4bdf-9e84-78ff9ccd8712-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.571064 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerID="f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a" exitCode=0 Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.571118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerDied","Data":"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a"} Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.571174 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c200544-1881-4bdf-9e84-78ff9ccd8712","Type":"ContainerDied","Data":"e9513817e7d010d87216954fe669fc4175b587b369c25b1bd087c8e4a6274158"} Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.571191 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.571200 4776 scope.go:117] "RemoveContainer" containerID="33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.596087 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.601382 4776 scope.go:117] "RemoveContainer" containerID="f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.607039 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632433 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632862 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632904 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632925 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632931 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632939 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="dnsmasq-dns" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632945 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="dnsmasq-dns" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632956 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="probe" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632963 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="probe" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632971 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="cinder-scheduler" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632976 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="cinder-scheduler" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.632991 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="init" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.632996 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="init" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.633176 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="probe" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.633184 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" containerName="cinder-scheduler" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.633239 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" containerName="barbican-api-log" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.633255 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="67280a26-5583-4b51-b74d-c3e8b2ea6645" 
containerName="barbican-api" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.633265 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5bc5428-35ca-44a5-8fa9-7d11ec4c6804" containerName="dnsmasq-dns" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.634304 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.638590 4776 scope.go:117] "RemoveContainer" containerID="33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.638833 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.640179 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df\": container with ID starting with 33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df not found: ID does not exist" containerID="33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.640234 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df"} err="failed to get container status \"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df\": rpc error: code = NotFound desc = could not find container \"33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df\": container with ID starting with 33048e7ea047aed28f23683bfdfb07e14ab44c506ed91098917b6c2f2c3603df not found: ID does not exist" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.640262 4776 scope.go:117] "RemoveContainer" containerID="f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a" Jan 28 
07:09:01 crc kubenswrapper[4776]: E0128 07:09:01.640872 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a\": container with ID starting with f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a not found: ID does not exist" containerID="f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.640900 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a"} err="failed to get container status \"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a\": rpc error: code = NotFound desc = could not find container \"f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a\": container with ID starting with f22d6d0917682bf84ee80b0f2006fa363a35d242b171283ad76c91aa3e0c391a not found: ID does not exist" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.647789 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.715595 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.716905 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.724140 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.730627 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.730728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nw4lv" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.730882 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772072 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-scripts\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772132 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b39df015-b6fa-40eb-b270-21f01f3cb141-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772372 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972ft\" (UniqueName: \"kubernetes.io/projected/b39df015-b6fa-40eb-b270-21f01f3cb141-kube-api-access-972ft\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772492 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.772536 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.874881 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-scripts\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.874972 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b39df015-b6fa-40eb-b270-21f01f3cb141-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875073 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972ft\" (UniqueName: \"kubernetes.io/projected/b39df015-b6fa-40eb-b270-21f01f3cb141-kube-api-access-972ft\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875079 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b39df015-b6fa-40eb-b270-21f01f3cb141-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875130 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98nj4\" (UniqueName: \"kubernetes.io/projected/abb12d44-fb9e-4ac4-95ad-a82606ff0709-kube-api-access-98nj4\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875401 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-combined-ca-bundle\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875604 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config-secret\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.875660 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.881387 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.882171 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-config-data\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc 
kubenswrapper[4776]: I0128 07:09:01.883282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.889800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39df015-b6fa-40eb-b270-21f01f3cb141-scripts\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.901840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972ft\" (UniqueName: \"kubernetes.io/projected/b39df015-b6fa-40eb-b270-21f01f3cb141-kube-api-access-972ft\") pod \"cinder-scheduler-0\" (UID: \"b39df015-b6fa-40eb-b270-21f01f3cb141\") " pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.952699 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.982463 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.982720 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98nj4\" (UniqueName: \"kubernetes.io/projected/abb12d44-fb9e-4ac4-95ad-a82606ff0709-kube-api-access-98nj4\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.982824 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-combined-ca-bundle\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.982884 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config-secret\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.985310 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.987470 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-openstack-config-secret\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.994076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb12d44-fb9e-4ac4-95ad-a82606ff0709-combined-ca-bundle\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:01 crc kubenswrapper[4776]: I0128 07:09:01.998776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98nj4\" (UniqueName: \"kubernetes.io/projected/abb12d44-fb9e-4ac4-95ad-a82606ff0709-kube-api-access-98nj4\") pod \"openstackclient\" (UID: \"abb12d44-fb9e-4ac4-95ad-a82606ff0709\") " pod="openstack/openstackclient" Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.040939 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 28 07:09:02 crc kubenswrapper[4776]: W0128 07:09:02.455977 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb39df015_b6fa_40eb_b270_21f01f3cb141.slice/crio-c1800a4882342b8304a1317eeeaee74e7bf571c61b3bb9f9ce7235de8f71aaa5 WatchSource:0}: Error finding container c1800a4882342b8304a1317eeeaee74e7bf571c61b3bb9f9ce7235de8f71aaa5: Status 404 returned error can't find the container with id c1800a4882342b8304a1317eeeaee74e7bf571c61b3bb9f9ce7235de8f71aaa5 Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.457768 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.595671 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b39df015-b6fa-40eb-b270-21f01f3cb141","Type":"ContainerStarted","Data":"c1800a4882342b8304a1317eeeaee74e7bf571c61b3bb9f9ce7235de8f71aaa5"} Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.630184 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.716143 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7967c58c5f-jkrnc" Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.790249 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.790489 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-659688b465-m49kr" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-api" containerID="cri-o://08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70" gracePeriod=30 Jan 28 07:09:02 crc kubenswrapper[4776]: I0128 07:09:02.790654 4776 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/neutron-659688b465-m49kr" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-httpd" containerID="cri-o://3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a" gracePeriod=30 Jan 28 07:09:03 crc kubenswrapper[4776]: I0128 07:09:03.313162 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c200544-1881-4bdf-9e84-78ff9ccd8712" path="/var/lib/kubelet/pods/0c200544-1881-4bdf-9e84-78ff9ccd8712/volumes" Jan 28 07:09:03 crc kubenswrapper[4776]: I0128 07:09:03.609262 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerID="3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a" exitCode=0 Jan 28 07:09:03 crc kubenswrapper[4776]: I0128 07:09:03.609501 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerDied","Data":"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a"} Jan 28 07:09:03 crc kubenswrapper[4776]: I0128 07:09:03.611096 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"abb12d44-fb9e-4ac4-95ad-a82606ff0709","Type":"ContainerStarted","Data":"75e337bbdf8da37c77d5960ab2fcba3f39aee8dd33bf4e21f50959a8afaa319d"} Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.333535 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.366738 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65b6d456f6-wvlnv" Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.637625 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"b39df015-b6fa-40eb-b270-21f01f3cb141","Type":"ContainerStarted","Data":"cdc28715afd20cc79c81744de3a3490f0f3946548c48aa5764c672441e64ccf5"} Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.637676 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b39df015-b6fa-40eb-b270-21f01f3cb141","Type":"ContainerStarted","Data":"737c78fe7ee5948aceca120cdc648c08ee4e560a4516852d27753859cbe143a6"} Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.659841 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.659821879 podStartE2EDuration="3.659821879s" podCreationTimestamp="2026-01-28 07:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:04.656932881 +0000 UTC m=+1116.072593041" watchObservedRunningTime="2026-01-28 07:09:04.659821879 +0000 UTC m=+1116.075482039" Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.885446 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 28 07:09:04 crc kubenswrapper[4776]: I0128 07:09:04.898905 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 28 07:09:05 crc kubenswrapper[4776]: I0128 07:09:05.192956 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 28 07:09:05 crc kubenswrapper[4776]: I0128 07:09:05.661198 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 28 07:09:06 crc kubenswrapper[4776]: I0128 07:09:06.940518 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Jan 28 07:09:06 crc kubenswrapper[4776]: I0128 07:09:06.953273 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.284452 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-659688b465-m49kr" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401043 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401124 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401174 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401258 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf6gj\" (UniqueName: \"kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401291 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401361 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.401418 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs\") pod \"f4598f22-8a4c-4050-a2ea-011675c33d1f\" (UID: \"f4598f22-8a4c-4050-a2ea-011675c33d1f\") " Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.408963 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj" (OuterVolumeSpecName: "kube-api-access-wf6gj") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "kube-api-access-wf6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.409855 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.461974 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.475262 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config" (OuterVolumeSpecName: "config") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.482317 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.503575 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.503618 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.503633 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.503646 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.503659 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf6gj\" (UniqueName: \"kubernetes.io/projected/f4598f22-8a4c-4050-a2ea-011675c33d1f-kube-api-access-wf6gj\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.512946 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.522873 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4598f22-8a4c-4050-a2ea-011675c33d1f" (UID: "f4598f22-8a4c-4050-a2ea-011675c33d1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.605087 4776 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.605125 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4598f22-8a4c-4050-a2ea-011675c33d1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.695344 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76c7dcc8f9-s7l5r"] Jan 28 07:09:07 crc kubenswrapper[4776]: E0128 07:09:07.695829 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-httpd" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.695852 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-httpd" Jan 28 07:09:07 crc kubenswrapper[4776]: E0128 07:09:07.695876 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-api" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.695887 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-api" Jan 28 07:09:07 
crc kubenswrapper[4776]: I0128 07:09:07.696105 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-httpd" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.696132 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerName="neutron-api" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.697100 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.704108 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.704200 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.704871 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4598f22-8a4c-4050-a2ea-011675c33d1f" containerID="08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70" exitCode=0 Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.705636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerDied","Data":"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70"} Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.705670 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-659688b465-m49kr" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.705692 4776 scope.go:117] "RemoveContainer" containerID="3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.705679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659688b465-m49kr" event={"ID":"f4598f22-8a4c-4050-a2ea-011675c33d1f","Type":"ContainerDied","Data":"869dcbf82905f91282ba235e05da70b70179d65631d1fd86fd63d71cdb79694b"} Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.708503 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.719345 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76c7dcc8f9-s7l5r"] Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.804323 4776 scope.go:117] "RemoveContainer" containerID="08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.805170 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809681 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-combined-ca-bundle\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809734 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcgp\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-kube-api-access-8gcgp\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: 
\"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809825 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-log-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809853 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-config-data\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809891 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-etc-swift\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809913 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-run-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.809949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-public-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" 
(UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.810010 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-internal-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.813457 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-659688b465-m49kr"] Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.835954 4776 scope.go:117] "RemoveContainer" containerID="3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a" Jan 28 07:09:07 crc kubenswrapper[4776]: E0128 07:09:07.836917 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a\": container with ID starting with 3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a not found: ID does not exist" containerID="3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.836954 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a"} err="failed to get container status \"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a\": rpc error: code = NotFound desc = could not find container \"3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a\": container with ID starting with 3679ed31d37acd7d8baab6f3f1b1e0764cb0eaed5bd02b59a650157b5400295a not found: ID does not exist" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.836991 4776 scope.go:117] 
"RemoveContainer" containerID="08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70" Jan 28 07:09:07 crc kubenswrapper[4776]: E0128 07:09:07.837330 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70\": container with ID starting with 08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70 not found: ID does not exist" containerID="08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.837352 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70"} err="failed to get container status \"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70\": rpc error: code = NotFound desc = could not find container \"08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70\": container with ID starting with 08d714467f191d11aa01f9db6668796c34c62f63f6c89edf434bc9040ffabc70 not found: ID does not exist" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.913962 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-combined-ca-bundle\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914157 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcgp\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-kube-api-access-8gcgp\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 
07:09:07.914339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-log-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914386 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-config-data\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914470 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-etc-swift\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-run-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-public-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.914832 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-log-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.915412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/157a7da1-1327-40fc-83f3-30c1ef472c78-run-httpd\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.915682 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-internal-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.920908 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-internal-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.921411 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-config-data\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.921913 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-etc-swift\") 
pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.922933 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-public-tls-certs\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.933048 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157a7da1-1327-40fc-83f3-30c1ef472c78-combined-ca-bundle\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:07 crc kubenswrapper[4776]: I0128 07:09:07.938127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcgp\" (UniqueName: \"kubernetes.io/projected/157a7da1-1327-40fc-83f3-30c1ef472c78-kube-api-access-8gcgp\") pod \"swift-proxy-76c7dcc8f9-s7l5r\" (UID: \"157a7da1-1327-40fc-83f3-30c1ef472c78\") " pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:08 crc kubenswrapper[4776]: I0128 07:09:08.083191 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:08 crc kubenswrapper[4776]: I0128 07:09:08.684835 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76c7dcc8f9-s7l5r"] Jan 28 07:09:08 crc kubenswrapper[4776]: W0128 07:09:08.687424 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod157a7da1_1327_40fc_83f3_30c1ef472c78.slice/crio-62fd910a5109dc1922f2b9602b90125b03c08869ea2e0873beaa82b223ecf42f WatchSource:0}: Error finding container 62fd910a5109dc1922f2b9602b90125b03c08869ea2e0873beaa82b223ecf42f: Status 404 returned error can't find the container with id 62fd910a5109dc1922f2b9602b90125b03c08869ea2e0873beaa82b223ecf42f Jan 28 07:09:08 crc kubenswrapper[4776]: I0128 07:09:08.720744 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" event={"ID":"157a7da1-1327-40fc-83f3-30c1ef472c78","Type":"ContainerStarted","Data":"62fd910a5109dc1922f2b9602b90125b03c08869ea2e0873beaa82b223ecf42f"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.001989 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.002496 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-central-agent" containerID="cri-o://4ce6a80eba720f8d0f0e2a150c530cf0b1d47f34ce4a588b47f4dd8e4425cb3d" gracePeriod=30 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.002627 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="proxy-httpd" containerID="cri-o://41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e" gracePeriod=30 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.002661 
4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="sg-core" containerID="cri-o://a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b" gracePeriod=30 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.002690 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-notification-agent" containerID="cri-o://7bcbe97d50dfb8d2eed7c5cdc47883a6e058b6fb2a3d9e3b47a75dce8b63edce" gracePeriod=30 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.032775 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.329300 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4598f22-8a4c-4050-a2ea-011675c33d1f" path="/var/lib/kubelet/pods/f4598f22-8a4c-4050-a2ea-011675c33d1f/volumes" Jan 28 07:09:09 crc kubenswrapper[4776]: E0128 07:09:09.351260 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod521b11a5_ef9c_4f72_94c9_6ea74c3f687b.slice/crio-41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod521b11a5_ef9c_4f72_94c9_6ea74c3f687b.slice/crio-a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod521b11a5_ef9c_4f72_94c9_6ea74c3f687b.slice/crio-conmon-a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b.scope\": RecentStats: unable to find data in memory cache]" Jan 28 07:09:09 crc kubenswrapper[4776]: 
I0128 07:09:09.743139 4776 generic.go:334] "Generic (PLEG): container finished" podID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerID="41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e" exitCode=0 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.743443 4776 generic.go:334] "Generic (PLEG): container finished" podID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerID="a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b" exitCode=2 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.743453 4776 generic.go:334] "Generic (PLEG): container finished" podID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerID="4ce6a80eba720f8d0f0e2a150c530cf0b1d47f34ce4a588b47f4dd8e4425cb3d" exitCode=0 Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.743490 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerDied","Data":"41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.743512 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerDied","Data":"a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.743523 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerDied","Data":"4ce6a80eba720f8d0f0e2a150c530cf0b1d47f34ce4a588b47f4dd8e4425cb3d"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.745042 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" event={"ID":"157a7da1-1327-40fc-83f3-30c1ef472c78","Type":"ContainerStarted","Data":"9d332eb69fa9f964ad07a6b5bd96586ac9bac273a18d38a3689cfb087f81d597"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.745063 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" event={"ID":"157a7da1-1327-40fc-83f3-30c1ef472c78","Type":"ContainerStarted","Data":"b91baab0f9ad38a6acd26bb4deda05e0df4e3ec2cebaf10560b50708f98357ba"} Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.746289 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.746316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:09 crc kubenswrapper[4776]: I0128 07:09:09.769624 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" podStartSLOduration=2.7694872740000003 podStartE2EDuration="2.769487274s" podCreationTimestamp="2026-01-28 07:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:09.760455229 +0000 UTC m=+1121.176115389" watchObservedRunningTime="2026-01-28 07:09:09.769487274 +0000 UTC m=+1121.185147424" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.189307 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rh4ps"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.194058 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.204640 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rh4ps"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.279566 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-79sh9"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.281034 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.292959 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-79sh9"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.312834 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpmj\" (UniqueName: \"kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.313060 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.379485 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-36c2-account-create-update-mdsmj"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.380794 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.382661 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.388362 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4knk8"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.390805 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.414695 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-36c2-account-create-update-mdsmj"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.415997 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.416086 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5n2b\" (UniqueName: \"kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.416135 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzvz\" (UniqueName: \"kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.416296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.416406 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.416439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpmj\" (UniqueName: \"kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.420159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.454975 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4knk8"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.461881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpmj\" (UniqueName: \"kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj\") pod \"nova-api-db-create-rh4ps\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519018 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts\") pod \"nova-cell1-db-create-4knk8\" (UID: 
\"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519069 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzv2l\" (UniqueName: \"kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l\") pod \"nova-cell1-db-create-4knk8\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519110 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519140 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5n2b\" (UniqueName: \"kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519169 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzvz\" (UniqueName: \"kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.519254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.520079 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.520209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.524057 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.535974 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5n2b\" (UniqueName: \"kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b\") pod \"nova-cell0-db-create-79sh9\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.537111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzvz\" (UniqueName: \"kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz\") pod \"nova-api-36c2-account-create-update-mdsmj\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.586252 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8caa-account-create-update-jcswg"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.587570 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.590106 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.597236 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8caa-account-create-update-jcswg"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.624056 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.624663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts\") pod \"nova-cell1-db-create-4knk8\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.624721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzv2l\" (UniqueName: \"kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l\") pod \"nova-cell1-db-create-4knk8\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.625532 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts\") pod \"nova-cell1-db-create-4knk8\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.642594 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzv2l\" (UniqueName: \"kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l\") pod \"nova-cell1-db-create-4knk8\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.700570 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.717358 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.726717 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv48t\" (UniqueName: \"kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t\") pod \"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.726832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts\") pod \"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.787126 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c3a3-account-create-update-xq2r2"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.788507 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.790370 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.802077 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c3a3-account-create-update-xq2r2"] Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.829052 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv48t\" (UniqueName: \"kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t\") pod \"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.829166 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts\") pod \"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.830704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts\") pod \"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.852161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv48t\" (UniqueName: \"kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t\") pod 
\"nova-cell0-8caa-account-create-update-jcswg\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.931047 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:11 crc kubenswrapper[4776]: I0128 07:09:11.931140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmsct\" (UniqueName: \"kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.009676 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.033893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.033941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmsct\" (UniqueName: \"kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.034682 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.052986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmsct\" (UniqueName: \"kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct\") pod \"nova-cell1-c3a3-account-create-update-xq2r2\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.108953 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:12 crc kubenswrapper[4776]: I0128 07:09:12.363011 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 28 07:09:13 crc kubenswrapper[4776]: I0128 07:09:13.092781 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:14 crc kubenswrapper[4776]: I0128 07:09:14.814626 4776 generic.go:334] "Generic (PLEG): container finished" podID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerID="7bcbe97d50dfb8d2eed7c5cdc47883a6e058b6fb2a3d9e3b47a75dce8b63edce" exitCode=0 Jan 28 07:09:14 crc kubenswrapper[4776]: I0128 07:09:14.814679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerDied","Data":"7bcbe97d50dfb8d2eed7c5cdc47883a6e058b6fb2a3d9e3b47a75dce8b63edce"} Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.674500 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.727591 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.747719 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxqt\" (UniqueName: \"kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.748422 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.748491 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.748523 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.748571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.748655 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd\") pod \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\" (UID: \"521b11a5-ef9c-4f72-94c9-6ea74c3f687b\") " Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.751098 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.754839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.767984 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts" (OuterVolumeSpecName: "scripts") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.780734 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt" (OuterVolumeSpecName: "kube-api-access-pzxqt") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "kube-api-access-pzxqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.803738 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.846187 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"521b11a5-ef9c-4f72-94c9-6ea74c3f687b","Type":"ContainerDied","Data":"0b23a08b5d9ae645c69c77818c33a9dd6256a47795bb8354a27291357cceea98"} Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.846567 4776 scope.go:117] "RemoveContainer" containerID="41368e882e48f9d639a73241f1cb38230da29a960d7f1544dfb8f54639d4642e" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.846860 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.853082 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"abb12d44-fb9e-4ac4-95ad-a82606ff0709","Type":"ContainerStarted","Data":"29c87f118d16c27f020351a37bf025a8bbebc6027b86536314bd269dafb6c559"} Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.854731 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.854836 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.854931 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.855020 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxqt\" (UniqueName: \"kubernetes.io/projected/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-kube-api-access-pzxqt\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.855101 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.876520 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.126768515 podStartE2EDuration="15.876502336s" podCreationTimestamp="2026-01-28 07:09:01 +0000 UTC" firstStartedPulling="2026-01-28 
07:09:02.670014315 +0000 UTC m=+1114.085674475" lastFinishedPulling="2026-01-28 07:09:16.419748136 +0000 UTC m=+1127.835408296" observedRunningTime="2026-01-28 07:09:16.875070497 +0000 UTC m=+1128.290730657" watchObservedRunningTime="2026-01-28 07:09:16.876502336 +0000 UTC m=+1128.292162496" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.887780 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.903468 4776 scope.go:117] "RemoveContainer" containerID="a6c345e15975fc0ea1f2d97db3df8e1cfce38515f9b1553da0ef391a6afece8b" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.940527 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c7f79f5b8-2xn7l" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.940684 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.945624 4776 scope.go:117] "RemoveContainer" containerID="7bcbe97d50dfb8d2eed7c5cdc47883a6e058b6fb2a3d9e3b47a75dce8b63edce" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.957517 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.986773 
4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data" (OuterVolumeSpecName: "config-data") pod "521b11a5-ef9c-4f72-94c9-6ea74c3f687b" (UID: "521b11a5-ef9c-4f72-94c9-6ea74c3f687b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:16 crc kubenswrapper[4776]: I0128 07:09:16.994008 4776 scope.go:117] "RemoveContainer" containerID="4ce6a80eba720f8d0f0e2a150c530cf0b1d47f34ce4a588b47f4dd8e4425cb3d" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.059357 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521b11a5-ef9c-4f72-94c9-6ea74c3f687b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.062185 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-36c2-account-create-update-mdsmj"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.081332 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8caa-account-create-update-jcswg"] Jan 28 07:09:17 crc kubenswrapper[4776]: W0128 07:09:17.089330 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd17a5a2d_48f9_4ebb_a103_1c9e92d82f41.slice/crio-14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9 WatchSource:0}: Error finding container 14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9: Status 404 returned error can't find the container with id 14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9 Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.100346 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c3a3-account-create-update-xq2r2"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.141732 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-db-create-79sh9"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.231693 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.262298 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.275742 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:17 crc kubenswrapper[4776]: E0128 07:09:17.276132 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-notification-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276155 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-notification-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: E0128 07:09:17.276183 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="proxy-httpd" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276189 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="proxy-httpd" Jan 28 07:09:17 crc kubenswrapper[4776]: E0128 07:09:17.276207 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-central-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276214 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-central-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: E0128 07:09:17.276228 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="sg-core" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276233 4776 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="sg-core" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276430 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-notification-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276446 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="proxy-httpd" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276456 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="ceilometer-central-agent" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.276469 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" containerName="sg-core" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.278601 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.281590 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.282901 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.288106 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4knk8"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.323356 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="521b11a5-ef9c-4f72-94c9-6ea74c3f687b" path="/var/lib/kubelet/pods/521b11a5-ef9c-4f72-94c9-6ea74c3f687b/volumes" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.324180 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.324211 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rh4ps"] Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392255 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392315 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h26zj\" (UniqueName: \"kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392339 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392505 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392595 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392612 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.392632 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.494459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts\") 
pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.494943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495177 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495251 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h26zj\" (UniqueName: \"kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495321 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.495727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.496081 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.499409 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.500048 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.500079 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 
07:09:17.500179 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.514665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h26zj\" (UniqueName: \"kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj\") pod \"ceilometer-0\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.616308 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.908526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" event={"ID":"7649c00f-e457-48f3-8d5c-28ca197fb663","Type":"ContainerStarted","Data":"dedf562340912036619cca369ad682470e50dd8d4ea3311e00b8506d2395f06a"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.908844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" event={"ID":"7649c00f-e457-48f3-8d5c-28ca197fb663","Type":"ContainerStarted","Data":"a0e3f81b8101a7aa05f992bdc93cc314b4bdbef479965b5dbd299401ba58ce6e"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.934791 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" podStartSLOduration=6.934764869 podStartE2EDuration="6.934764869s" podCreationTimestamp="2026-01-28 07:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:17.931375148 +0000 UTC m=+1129.347035308" 
watchObservedRunningTime="2026-01-28 07:09:17.934764869 +0000 UTC m=+1129.350425039" Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.955736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rh4ps" event={"ID":"c8aa4600-018d-4394-a873-92af1c70b5ba","Type":"ContainerStarted","Data":"a76f39a347c923f93cb7e5af13b816e4676bef72ee179caf1cbdbedbb7eb8b24"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.955943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rh4ps" event={"ID":"c8aa4600-018d-4394-a873-92af1c70b5ba","Type":"ContainerStarted","Data":"b89b62723c7682bfafb63bf1c0414763b31a9fd842c603b63636fa8f7fdeb92a"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.962937 4776 generic.go:334] "Generic (PLEG): container finished" podID="435a9994-8725-46d9-ad26-4cf3179a61e9" containerID="563abec8d38830b597d10b9c43b52d90dcbaa959229193230499831920913b27" exitCode=0 Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.962996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36c2-account-create-update-mdsmj" event={"ID":"435a9994-8725-46d9-ad26-4cf3179a61e9","Type":"ContainerDied","Data":"563abec8d38830b597d10b9c43b52d90dcbaa959229193230499831920913b27"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.963020 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36c2-account-create-update-mdsmj" event={"ID":"435a9994-8725-46d9-ad26-4cf3179a61e9","Type":"ContainerStarted","Data":"f413dc9cd5068db16ea782d8882f5782c08ef38a48017e0df60fde8d47f25411"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.964188 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79sh9" event={"ID":"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41","Type":"ContainerStarted","Data":"a5021333bc8a17baae09ea0b334e2701df8bbf478ef05e433929c14053ec8de5"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.964210 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79sh9" event={"ID":"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41","Type":"ContainerStarted","Data":"14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.981224 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" event={"ID":"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46","Type":"ContainerStarted","Data":"76f3490238c1debb347ca950de1a442bd8b51a2d3664cc77098990367a729c10"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.981272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" event={"ID":"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46","Type":"ContainerStarted","Data":"0d4e37368feac924515c8b693ca17ecb022db94421a588f46b9dc6b9dc9425b4"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.983304 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4knk8" event={"ID":"14baefd8-d74d-44bd-83a9-64f6d8a71fbe","Type":"ContainerStarted","Data":"d8f2e34a8dac1da7497496239cdfac06c222fef456fa7664b5899176a9259827"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.983353 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4knk8" event={"ID":"14baefd8-d74d-44bd-83a9-64f6d8a71fbe","Type":"ContainerStarted","Data":"24130b6eb356d766750999a8cefb05b147a836dc59f49e8123cd2c3528461fa3"} Jan 28 07:09:17 crc kubenswrapper[4776]: I0128 07:09:17.988630 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-rh4ps" podStartSLOduration=6.9886108799999995 podStartE2EDuration="6.98861088s" podCreationTimestamp="2026-01-28 07:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:17.975819993 +0000 UTC 
m=+1129.391480153" watchObservedRunningTime="2026-01-28 07:09:17.98861088 +0000 UTC m=+1129.404271030" Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.005042 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-79sh9" podStartSLOduration=7.005025505 podStartE2EDuration="7.005025505s" podCreationTimestamp="2026-01-28 07:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:17.99711972 +0000 UTC m=+1129.412779890" watchObservedRunningTime="2026-01-28 07:09:18.005025505 +0000 UTC m=+1129.420685665" Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.080288 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-4knk8" podStartSLOduration=7.080268293 podStartE2EDuration="7.080268293s" podCreationTimestamp="2026-01-28 07:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:18.034858383 +0000 UTC m=+1129.450518533" watchObservedRunningTime="2026-01-28 07:09:18.080268293 +0000 UTC m=+1129.495928453" Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.083492 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" podStartSLOduration=7.083482331 podStartE2EDuration="7.083482331s" podCreationTimestamp="2026-01-28 07:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:18.081751394 +0000 UTC m=+1129.497411554" watchObservedRunningTime="2026-01-28 07:09:18.083482331 +0000 UTC m=+1129.499142491" Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.092419 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-76c7dcc8f9-s7l5r" Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.254067 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:18 crc kubenswrapper[4776]: I0128 07:09:18.552627 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:18.999864 4776 generic.go:334] "Generic (PLEG): container finished" podID="14baefd8-d74d-44bd-83a9-64f6d8a71fbe" containerID="d8f2e34a8dac1da7497496239cdfac06c222fef456fa7664b5899176a9259827" exitCode=0 Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.000212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4knk8" event={"ID":"14baefd8-d74d-44bd-83a9-64f6d8a71fbe","Type":"ContainerDied","Data":"d8f2e34a8dac1da7497496239cdfac06c222fef456fa7664b5899176a9259827"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.004401 4776 generic.go:334] "Generic (PLEG): container finished" podID="7649c00f-e457-48f3-8d5c-28ca197fb663" containerID="dedf562340912036619cca369ad682470e50dd8d4ea3311e00b8506d2395f06a" exitCode=0 Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.004526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" event={"ID":"7649c00f-e457-48f3-8d5c-28ca197fb663","Type":"ContainerDied","Data":"dedf562340912036619cca369ad682470e50dd8d4ea3311e00b8506d2395f06a"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.007075 4776 generic.go:334] "Generic (PLEG): container finished" podID="c8aa4600-018d-4394-a873-92af1c70b5ba" containerID="a76f39a347c923f93cb7e5af13b816e4676bef72ee179caf1cbdbedbb7eb8b24" exitCode=0 Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.007171 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rh4ps" 
event={"ID":"c8aa4600-018d-4394-a873-92af1c70b5ba","Type":"ContainerDied","Data":"a76f39a347c923f93cb7e5af13b816e4676bef72ee179caf1cbdbedbb7eb8b24"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.009339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerStarted","Data":"9d86522de39d71d513eef9233e51f150be8d24a3c9fdc02b02b6f6a68a21f3e4"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.013621 4776 generic.go:334] "Generic (PLEG): container finished" podID="d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" containerID="a5021333bc8a17baae09ea0b334e2701df8bbf478ef05e433929c14053ec8de5" exitCode=0 Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.013760 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79sh9" event={"ID":"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41","Type":"ContainerDied","Data":"a5021333bc8a17baae09ea0b334e2701df8bbf478ef05e433929c14053ec8de5"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.015591 4776 generic.go:334] "Generic (PLEG): container finished" podID="dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" containerID="76f3490238c1debb347ca950de1a442bd8b51a2d3664cc77098990367a729c10" exitCode=0 Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.015827 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" event={"ID":"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46","Type":"ContainerDied","Data":"76f3490238c1debb347ca950de1a442bd8b51a2d3664cc77098990367a729c10"} Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.465342 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.561470 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzvz\" (UniqueName: \"kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz\") pod \"435a9994-8725-46d9-ad26-4cf3179a61e9\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.561711 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts\") pod \"435a9994-8725-46d9-ad26-4cf3179a61e9\" (UID: \"435a9994-8725-46d9-ad26-4cf3179a61e9\") " Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.562466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "435a9994-8725-46d9-ad26-4cf3179a61e9" (UID: "435a9994-8725-46d9-ad26-4cf3179a61e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.566222 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz" (OuterVolumeSpecName: "kube-api-access-gzzvz") pod "435a9994-8725-46d9-ad26-4cf3179a61e9" (UID: "435a9994-8725-46d9-ad26-4cf3179a61e9"). InnerVolumeSpecName "kube-api-access-gzzvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.664537 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a9994-8725-46d9-ad26-4cf3179a61e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:19 crc kubenswrapper[4776]: I0128 07:09:19.664590 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzvz\" (UniqueName: \"kubernetes.io/projected/435a9994-8725-46d9-ad26-4cf3179a61e9-kube-api-access-gzzvz\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.028582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerStarted","Data":"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4"} Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.028999 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerStarted","Data":"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af"} Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.030592 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-36c2-account-create-update-mdsmj" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.030602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36c2-account-create-update-mdsmj" event={"ID":"435a9994-8725-46d9-ad26-4cf3179a61e9","Type":"ContainerDied","Data":"f413dc9cd5068db16ea782d8882f5782c08ef38a48017e0df60fde8d47f25411"} Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.030631 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f413dc9cd5068db16ea782d8882f5782c08ef38a48017e0df60fde8d47f25411" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.324273 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.487236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv48t\" (UniqueName: \"kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t\") pod \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.488131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts\") pod \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\" (UID: \"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.488735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" (UID: "dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.489837 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.507355 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t" (OuterVolumeSpecName: "kube-api-access-lv48t") pod "dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" (UID: "dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46"). InnerVolumeSpecName "kube-api-access-lv48t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.591368 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv48t\" (UniqueName: \"kubernetes.io/projected/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46-kube-api-access-lv48t\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.703853 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.717559 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.726696 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.734011 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.796249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmsct\" (UniqueName: \"kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct\") pod \"7649c00f-e457-48f3-8d5c-28ca197fb663\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.796373 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts\") pod \"7649c00f-e457-48f3-8d5c-28ca197fb663\" (UID: \"7649c00f-e457-48f3-8d5c-28ca197fb663\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.797070 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7649c00f-e457-48f3-8d5c-28ca197fb663" (UID: "7649c00f-e457-48f3-8d5c-28ca197fb663"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.812900 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct" (OuterVolumeSpecName: "kube-api-access-rmsct") pod "7649c00f-e457-48f3-8d5c-28ca197fb663" (UID: "7649c00f-e457-48f3-8d5c-28ca197fb663"). InnerVolumeSpecName "kube-api-access-rmsct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.898076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5n2b\" (UniqueName: \"kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b\") pod \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.898191 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts\") pod \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\" (UID: \"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.898570 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" (UID: "d17a5a2d-48f9-4ebb-a103-1c9e92d82f41"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.898634 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzv2l\" (UniqueName: \"kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l\") pod \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.898670 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjpmj\" (UniqueName: \"kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj\") pod \"c8aa4600-018d-4394-a873-92af1c70b5ba\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899001 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts\") pod \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\" (UID: \"14baefd8-d74d-44bd-83a9-64f6d8a71fbe\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899065 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts\") pod \"c8aa4600-018d-4394-a873-92af1c70b5ba\" (UID: \"c8aa4600-018d-4394-a873-92af1c70b5ba\") " Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899475 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7649c00f-e457-48f3-8d5c-28ca197fb663-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899491 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899500 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmsct\" (UniqueName: \"kubernetes.io/projected/7649c00f-e457-48f3-8d5c-28ca197fb663-kube-api-access-rmsct\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.899948 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8aa4600-018d-4394-a873-92af1c70b5ba" (UID: "c8aa4600-018d-4394-a873-92af1c70b5ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.900588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14baefd8-d74d-44bd-83a9-64f6d8a71fbe" (UID: "14baefd8-d74d-44bd-83a9-64f6d8a71fbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.902938 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l" (OuterVolumeSpecName: "kube-api-access-qzv2l") pod "14baefd8-d74d-44bd-83a9-64f6d8a71fbe" (UID: "14baefd8-d74d-44bd-83a9-64f6d8a71fbe"). InnerVolumeSpecName "kube-api-access-qzv2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.905669 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b" (OuterVolumeSpecName: "kube-api-access-k5n2b") pod "d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" (UID: "d17a5a2d-48f9-4ebb-a103-1c9e92d82f41"). InnerVolumeSpecName "kube-api-access-k5n2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:20 crc kubenswrapper[4776]: I0128 07:09:20.905937 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj" (OuterVolumeSpecName: "kube-api-access-xjpmj") pod "c8aa4600-018d-4394-a873-92af1c70b5ba" (UID: "c8aa4600-018d-4394-a873-92af1c70b5ba"). InnerVolumeSpecName "kube-api-access-xjpmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.001786 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8aa4600-018d-4394-a873-92af1c70b5ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.002030 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5n2b\" (UniqueName: \"kubernetes.io/projected/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41-kube-api-access-k5n2b\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.002106 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzv2l\" (UniqueName: \"kubernetes.io/projected/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-kube-api-access-qzv2l\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.002175 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjpmj\" (UniqueName: 
\"kubernetes.io/projected/c8aa4600-018d-4394-a873-92af1c70b5ba-kube-api-access-xjpmj\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.002242 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14baefd8-d74d-44bd-83a9-64f6d8a71fbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.068604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" event={"ID":"dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46","Type":"ContainerDied","Data":"0d4e37368feac924515c8b693ca17ecb022db94421a588f46b9dc6b9dc9425b4"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.068672 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d4e37368feac924515c8b693ca17ecb022db94421a588f46b9dc6b9dc9425b4" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.068617 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8caa-account-create-update-jcswg" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.079459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4knk8" event={"ID":"14baefd8-d74d-44bd-83a9-64f6d8a71fbe","Type":"ContainerDied","Data":"24130b6eb356d766750999a8cefb05b147a836dc59f49e8123cd2c3528461fa3"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.079497 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24130b6eb356d766750999a8cefb05b147a836dc59f49e8123cd2c3528461fa3" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.079581 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4knk8" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.081586 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" event={"ID":"7649c00f-e457-48f3-8d5c-28ca197fb663","Type":"ContainerDied","Data":"a0e3f81b8101a7aa05f992bdc93cc314b4bdbef479965b5dbd299401ba58ce6e"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.081614 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0e3f81b8101a7aa05f992bdc93cc314b4bdbef479965b5dbd299401ba58ce6e" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.081745 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c3a3-account-create-update-xq2r2" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.089601 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rh4ps" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.089598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rh4ps" event={"ID":"c8aa4600-018d-4394-a873-92af1c70b5ba","Type":"ContainerDied","Data":"b89b62723c7682bfafb63bf1c0414763b31a9fd842c603b63636fa8f7fdeb92a"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.090202 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89b62723c7682bfafb63bf1c0414763b31a9fd842c603b63636fa8f7fdeb92a" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.101540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerStarted","Data":"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.115366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79sh9" 
event={"ID":"d17a5a2d-48f9-4ebb-a103-1c9e92d82f41","Type":"ContainerDied","Data":"14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9"} Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.115412 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14bf81bfe30d10c950ac339a8c69bbfb89ab0cfd84c57be024a87f547e4d06b9" Jan 28 07:09:21 crc kubenswrapper[4776]: I0128 07:09:21.115498 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79sh9" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.126946 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerStarted","Data":"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa"} Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.127317 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.127099 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="sg-core" containerID="cri-o://14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d" gracePeriod=30 Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.127067 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-central-agent" containerID="cri-o://3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4" gracePeriod=30 Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.127165 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-notification-agent" 
containerID="cri-o://bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af" gracePeriod=30 Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.127114 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="proxy-httpd" containerID="cri-o://0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa" gracePeriod=30 Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.160027 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5973306269999998 podStartE2EDuration="5.160004743s" podCreationTimestamp="2026-01-28 07:09:17 +0000 UTC" firstStartedPulling="2026-01-28 07:09:18.267997981 +0000 UTC m=+1129.683658131" lastFinishedPulling="2026-01-28 07:09:21.830672087 +0000 UTC m=+1133.246332247" observedRunningTime="2026-01-28 07:09:22.151451802 +0000 UTC m=+1133.567111962" watchObservedRunningTime="2026-01-28 07:09:22.160004743 +0000 UTC m=+1133.575664903" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.814998 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.940495 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941140 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941598 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941652 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drrzk\" (UniqueName: \"kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk\") pod \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\" (UID: \"e5180ed1-0d82-4c44-aed4-3f3a5b34af93\") " Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.941712 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs" (OuterVolumeSpecName: "logs") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.942057 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.950735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk" (OuterVolumeSpecName: "kube-api-access-drrzk") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "kube-api-access-drrzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.962123 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.972367 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts" (OuterVolumeSpecName: "scripts") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.985276 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data" (OuterVolumeSpecName: "config-data") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:09:22 crc kubenswrapper[4776]: I0128 07:09:22.990427 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.016327 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e5180ed1-0d82-4c44-aed4-3f3a5b34af93" (UID: "e5180ed1-0d82-4c44-aed4-3f3a5b34af93"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.043945 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.043978 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drrzk\" (UniqueName: \"kubernetes.io/projected/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-kube-api-access-drrzk\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.043991 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.044001 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.044009 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.044017 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5180ed1-0d82-4c44-aed4-3f3a5b34af93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.135876 4776 generic.go:334] "Generic (PLEG): container finished" podID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerID="14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d" exitCode=2 Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.136564 4776 generic.go:334] "Generic (PLEG): container finished" podID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerID="bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af" exitCode=0 Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.136011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerDied","Data":"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d"} Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.136748 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerDied","Data":"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af"} Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.138395 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerID="97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3" exitCode=137 Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.138493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f79f5b8-2xn7l" event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerDied","Data":"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3"} Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.138574 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c7f79f5b8-2xn7l" 
event={"ID":"e5180ed1-0d82-4c44-aed4-3f3a5b34af93","Type":"ContainerDied","Data":"595c2365ffce3309df31267fa761d2f42a67961a567faa802ace558f1962b20c"} Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.138612 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c7f79f5b8-2xn7l" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.138701 4776 scope.go:117] "RemoveContainer" containerID="71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.171668 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.180012 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c7f79f5b8-2xn7l"] Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.305149 4776 scope.go:117] "RemoveContainer" containerID="97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.323442 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" path="/var/lib/kubelet/pods/e5180ed1-0d82-4c44-aed4-3f3a5b34af93/volumes" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.325800 4776 scope.go:117] "RemoveContainer" containerID="71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d" Jan 28 07:09:23 crc kubenswrapper[4776]: E0128 07:09:23.326281 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d\": container with ID starting with 71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d not found: ID does not exist" containerID="71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.326312 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d"} err="failed to get container status \"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d\": rpc error: code = NotFound desc = could not find container \"71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d\": container with ID starting with 71bf08e3ca2d98d43e5538be797edda0f652fb4b5f58ef46a467d0df066fe22d not found: ID does not exist" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.326339 4776 scope.go:117] "RemoveContainer" containerID="97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3" Jan 28 07:09:23 crc kubenswrapper[4776]: E0128 07:09:23.326983 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3\": container with ID starting with 97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3 not found: ID does not exist" containerID="97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3" Jan 28 07:09:23 crc kubenswrapper[4776]: I0128 07:09:23.327007 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3"} err="failed to get container status \"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3\": rpc error: code = NotFound desc = could not find container \"97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3\": container with ID starting with 97736af916b231252a9e5e9c2caf625fa76ea7906e2816fcdf86b6aad63ee7e3 not found: ID does not exist" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.174452 4776 generic.go:334] "Generic (PLEG): container finished" podID="148c5aca-4eba-4912-8d65-cb8e4820948b" 
containerID="3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4" exitCode=0 Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.185775 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerDied","Data":"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4"} Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649071 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vpps7"] Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649508 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aa4600-018d-4394-a873-92af1c70b5ba" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649527 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aa4600-018d-4394-a873-92af1c70b5ba" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649562 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435a9994-8725-46d9-ad26-4cf3179a61e9" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649571 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="435a9994-8725-46d9-ad26-4cf3179a61e9" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649583 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649590 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649610 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" 
containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649617 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649634 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon-log" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649641 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon-log" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649660 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7649c00f-e457-48f3-8d5c-28ca197fb663" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649668 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7649c00f-e457-48f3-8d5c-28ca197fb663" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649680 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14baefd8-d74d-44bd-83a9-64f6d8a71fbe" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649689 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="14baefd8-d74d-44bd-83a9-64f6d8a71fbe" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: E0128 07:09:26.649709 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649717 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649917 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649932 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649939 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5180ed1-0d82-4c44-aed4-3f3a5b34af93" containerName="horizon-log" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649952 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aa4600-018d-4394-a873-92af1c70b5ba" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649965 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="435a9994-8725-46d9-ad26-4cf3179a61e9" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649977 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="14baefd8-d74d-44bd-83a9-64f6d8a71fbe" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.649987 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" containerName="mariadb-database-create" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.650011 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7649c00f-e457-48f3-8d5c-28ca197fb663" containerName="mariadb-account-create-update" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.652227 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.661278 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.661604 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.661462 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tzvqv" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.663003 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vpps7"] Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.714301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.714367 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.714569 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dm5\" (UniqueName: \"kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " 
pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.714758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.816369 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.816438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.816464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.816525 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2dm5\" (UniqueName: \"kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: 
\"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.825194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.830287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.831947 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.832567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2dm5\" (UniqueName: \"kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5\") pod \"nova-cell0-conductor-db-sync-vpps7\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:26 crc kubenswrapper[4776]: I0128 07:09:26.977262 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:27 crc kubenswrapper[4776]: W0128 07:09:27.484130 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod745dbbf2_ccdc_4a8f_878b_9208eb693cc5.slice/crio-8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9 WatchSource:0}: Error finding container 8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9: Status 404 returned error can't find the container with id 8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9 Jan 28 07:09:27 crc kubenswrapper[4776]: I0128 07:09:27.496503 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vpps7"] Jan 28 07:09:28 crc kubenswrapper[4776]: I0128 07:09:28.208364 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vpps7" event={"ID":"745dbbf2-ccdc-4a8f-878b-9208eb693cc5","Type":"ContainerStarted","Data":"8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9"} Jan 28 07:09:29 crc kubenswrapper[4776]: I0128 07:09:29.661398 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:29 crc kubenswrapper[4776]: I0128 07:09:29.661677 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-log" containerID="cri-o://b5e56bd0be33523a64dd7f889e0d34604e29671e8c1d5c4ba1b566729e090a4d" gracePeriod=30 Jan 28 07:09:29 crc kubenswrapper[4776]: I0128 07:09:29.661841 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-httpd" containerID="cri-o://33cf145a97753123cfad01797a3b02d87459dfeda5a994cb916ae2b60c05f7d7" gracePeriod=30 Jan 28 
07:09:30 crc kubenswrapper[4776]: I0128 07:09:30.234235 4776 generic.go:334] "Generic (PLEG): container finished" podID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerID="b5e56bd0be33523a64dd7f889e0d34604e29671e8c1d5c4ba1b566729e090a4d" exitCode=143 Jan 28 07:09:30 crc kubenswrapper[4776]: I0128 07:09:30.234480 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerDied","Data":"b5e56bd0be33523a64dd7f889e0d34604e29671e8c1d5c4ba1b566729e090a4d"} Jan 28 07:09:30 crc kubenswrapper[4776]: I0128 07:09:30.551505 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:30 crc kubenswrapper[4776]: I0128 07:09:30.552055 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerName="watcher-decision-engine" containerID="cri-o://95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" gracePeriod=30 Jan 28 07:09:31 crc kubenswrapper[4776]: I0128 07:09:31.625709 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:31 crc kubenswrapper[4776]: I0128 07:09:31.626108 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-log" containerID="cri-o://4c93005f1832027045b278fb28689f32912bd457d90ef31e7cacbe5bd6cd8b14" gracePeriod=30 Jan 28 07:09:31 crc kubenswrapper[4776]: I0128 07:09:31.626443 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-httpd" containerID="cri-o://d0d276b338d842190888892da687520404f2f083ff0b36044a1f5eb6dda63bb6" gracePeriod=30 Jan 28 07:09:32 crc 
kubenswrapper[4776]: I0128 07:09:32.304656 4776 generic.go:334] "Generic (PLEG): container finished" podID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerID="4c93005f1832027045b278fb28689f32912bd457d90ef31e7cacbe5bd6cd8b14" exitCode=143 Jan 28 07:09:32 crc kubenswrapper[4776]: I0128 07:09:32.304866 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerDied","Data":"4c93005f1832027045b278fb28689f32912bd457d90ef31e7cacbe5bd6cd8b14"} Jan 28 07:09:32 crc kubenswrapper[4776]: E0128 07:09:32.524455 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 28 07:09:32 crc kubenswrapper[4776]: E0128 07:09:32.526398 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 28 07:09:32 crc kubenswrapper[4776]: E0128 07:09:32.531114 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 28 07:09:32 crc kubenswrapper[4776]: E0128 07:09:32.531166 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerName="watcher-decision-engine" Jan 28 07:09:33 crc kubenswrapper[4776]: I0128 07:09:33.315855 4776 generic.go:334] "Generic (PLEG): container finished" podID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerID="33cf145a97753123cfad01797a3b02d87459dfeda5a994cb916ae2b60c05f7d7" exitCode=0 Jan 28 07:09:33 crc kubenswrapper[4776]: I0128 07:09:33.321690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerDied","Data":"33cf145a97753123cfad01797a3b02d87459dfeda5a994cb916ae2b60c05f7d7"} Jan 28 07:09:35 crc kubenswrapper[4776]: I0128 07:09:35.347022 4776 generic.go:334] "Generic (PLEG): container finished" podID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerID="95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" exitCode=0 Jan 28 07:09:35 crc kubenswrapper[4776]: I0128 07:09:35.347080 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"2ba7ddaa-338f-46aa-9609-609740f34cb7","Type":"ContainerDied","Data":"95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1"} Jan 28 07:09:35 crc kubenswrapper[4776]: I0128 07:09:35.349529 4776 generic.go:334] "Generic (PLEG): container finished" podID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerID="d0d276b338d842190888892da687520404f2f083ff0b36044a1f5eb6dda63bb6" exitCode=0 Jan 28 07:09:35 crc kubenswrapper[4776]: I0128 07:09:35.349615 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerDied","Data":"d0d276b338d842190888892da687520404f2f083ff0b36044a1f5eb6dda63bb6"} Jan 28 07:09:35 crc kubenswrapper[4776]: I0128 07:09:35.962433 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.127179 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle\") pod \"2ba7ddaa-338f-46aa-9609-609740f34cb7\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.127232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data\") pod \"2ba7ddaa-338f-46aa-9609-609740f34cb7\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.127274 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs\") pod \"2ba7ddaa-338f-46aa-9609-609740f34cb7\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.127358 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g98tl\" (UniqueName: \"kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl\") pod \"2ba7ddaa-338f-46aa-9609-609740f34cb7\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.127410 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca\") pod \"2ba7ddaa-338f-46aa-9609-609740f34cb7\" (UID: \"2ba7ddaa-338f-46aa-9609-609740f34cb7\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.128440 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs" (OuterVolumeSpecName: "logs") pod "2ba7ddaa-338f-46aa-9609-609740f34cb7" (UID: "2ba7ddaa-338f-46aa-9609-609740f34cb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.132754 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl" (OuterVolumeSpecName: "kube-api-access-g98tl") pod "2ba7ddaa-338f-46aa-9609-609740f34cb7" (UID: "2ba7ddaa-338f-46aa-9609-609740f34cb7"). InnerVolumeSpecName "kube-api-access-g98tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.163824 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2ba7ddaa-338f-46aa-9609-609740f34cb7" (UID: "2ba7ddaa-338f-46aa-9609-609740f34cb7"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.211804 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data" (OuterVolumeSpecName: "config-data") pod "2ba7ddaa-338f-46aa-9609-609740f34cb7" (UID: "2ba7ddaa-338f-46aa-9609-609740f34cb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.216105 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ba7ddaa-338f-46aa-9609-609740f34cb7" (UID: "2ba7ddaa-338f-46aa-9609-609740f34cb7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.241756 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.241791 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.241801 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba7ddaa-338f-46aa-9609-609740f34cb7-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.241809 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g98tl\" (UniqueName: \"kubernetes.io/projected/2ba7ddaa-338f-46aa-9609-609740f34cb7-kube-api-access-g98tl\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.241819 4776 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ba7ddaa-338f-46aa-9609-609740f34cb7-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.294780 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.365911 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vpps7" event={"ID":"745dbbf2-ccdc-4a8f-878b-9208eb693cc5","Type":"ContainerStarted","Data":"dd68b836b756665baf0195ac99aaa3369ce42363de65b984002275c975ab30e1"} Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.368435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7d5d2aa-a358-4412-8668-444842b2bdc5","Type":"ContainerDied","Data":"517a8fd914c0b490dae9bdddcce183872f3246515cfb13e0a30accc6d643d7d9"} Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.368595 4776 scope.go:117] "RemoveContainer" containerID="33cf145a97753123cfad01797a3b02d87459dfeda5a994cb916ae2b60c05f7d7" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.368744 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.370314 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"2ba7ddaa-338f-46aa-9609-609740f34cb7","Type":"ContainerDied","Data":"d49e48162f51c6b7ea8eb71889bc72686552dd9ab8bf8ace2c33c1fd5718bf12"} Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.370387 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.386392 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vpps7" podStartSLOduration=1.973483095 podStartE2EDuration="10.386375472s" podCreationTimestamp="2026-01-28 07:09:26 +0000 UTC" firstStartedPulling="2026-01-28 07:09:27.486782493 +0000 UTC m=+1138.902442653" lastFinishedPulling="2026-01-28 07:09:35.89967486 +0000 UTC m=+1147.315335030" observedRunningTime="2026-01-28 07:09:36.380961975 +0000 UTC m=+1147.796622135" watchObservedRunningTime="2026-01-28 07:09:36.386375472 +0000 UTC m=+1147.802035632" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.398037 4776 scope.go:117] "RemoveContainer" containerID="b5e56bd0be33523a64dd7f889e0d34604e29671e8c1d5c4ba1b566729e090a4d" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.428056 4776 scope.go:117] "RemoveContainer" containerID="95dc261b2ec54a28ab7c6c625e9157540aa47b5bb30b13271909fe9e3c1c06b1" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.433202 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.443788 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444377 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444452 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts\") pod 
\"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444737 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444772 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444821 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.444894 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgb7g\" (UniqueName: 
\"kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g\") pod \"b7d5d2aa-a358-4412-8668-444842b2bdc5\" (UID: \"b7d5d2aa-a358-4412-8668-444842b2bdc5\") " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.455353 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.456941 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs" (OuterVolumeSpecName: "logs") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.458338 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.458503 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts" (OuterVolumeSpecName: "scripts") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: E0128 07:09:36.459715 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-log" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.459760 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-log" Jan 28 07:09:36 crc kubenswrapper[4776]: E0128 07:09:36.459802 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-httpd" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.459809 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-httpd" Jan 28 07:09:36 crc kubenswrapper[4776]: E0128 07:09:36.459837 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerName="watcher-decision-engine" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.459848 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerName="watcher-decision-engine" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.460017 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-log" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.460042 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" containerName="glance-httpd" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.460061 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" containerName="watcher-decision-engine" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.460655 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.461590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g" (OuterVolumeSpecName: "kube-api-access-fgb7g") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "kube-api-access-fgb7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.463011 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.467809 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.479869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.527909 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.546730 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-logs\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.546862 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.546943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db22h\" (UniqueName: \"kubernetes.io/projected/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-kube-api-access-db22h\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547049 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547075 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: 
\"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547138 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547151 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547162 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d5d2aa-a358-4412-8668-444842b2bdc5-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547171 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgb7g\" (UniqueName: \"kubernetes.io/projected/b7d5d2aa-a358-4412-8668-444842b2bdc5-kube-api-access-fgb7g\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547204 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.547223 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.565178 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data" (OuterVolumeSpecName: "config-data") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.567856 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b7d5d2aa-a358-4412-8668-444842b2bdc5" (UID: "b7d5d2aa-a358-4412-8668-444842b2bdc5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.574190 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648494 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db22h\" (UniqueName: \"kubernetes.io/projected/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-kube-api-access-db22h\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648589 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648614 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648668 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-logs\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648687 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648746 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648758 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.648768 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d5d2aa-a358-4412-8668-444842b2bdc5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.649106 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-logs\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.651718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.652901 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.653596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.665286 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db22h\" (UniqueName: \"kubernetes.io/projected/c2cf5aeb-349c-47d3-989b-d56e91f7ff51-kube-api-access-db22h\") pod \"watcher-decision-engine-0\" (UID: \"c2cf5aeb-349c-47d3-989b-d56e91f7ff51\") " pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.716846 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.724712 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.743450 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.746266 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.751638 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.751819 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.755132 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852025 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852082 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-config-data\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852102 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-logs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852190 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852221 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-scripts\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852234 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckc2s\" (UniqueName: \"kubernetes.io/projected/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-kube-api-access-ckc2s\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.852284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.946186 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954124 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954183 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-config-data\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-logs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-scripts\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: 
I0128 07:09:36.954344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954370 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckc2s\" (UniqueName: \"kubernetes.io/projected/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-kube-api-access-ckc2s\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954397 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.954526 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.955019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.958739 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-logs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.960500 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.961273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-scripts\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.962320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.965326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-config-data\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.977112 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckc2s\" (UniqueName: 
\"kubernetes.io/projected/78f1d6b5-48e7-4ad3-8066-acb3faf83f73-kube-api-access-ckc2s\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:36 crc kubenswrapper[4776]: I0128 07:09:36.981796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"78f1d6b5-48e7-4ad3-8066-acb3faf83f73\") " pod="openstack/glance-default-external-api-0" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.063434 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.323683 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba7ddaa-338f-46aa-9609-609740f34cb7" path="/var/lib/kubelet/pods/2ba7ddaa-338f-46aa-9609-609740f34cb7/volumes" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.325143 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d5d2aa-a358-4412-8668-444842b2bdc5" path="/var/lib/kubelet/pods/b7d5d2aa-a358-4412-8668-444842b2bdc5/volumes" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.400877 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.417580 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466261 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466310 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdzmq\" (UniqueName: \"kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466349 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466423 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466494 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: 
\"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466510 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466547 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.466614 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs\") pod \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\" (UID: \"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a\") " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.467115 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.467376 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs" (OuterVolumeSpecName: "logs") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.482097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.485072 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts" (OuterVolumeSpecName: "scripts") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.492921 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq" (OuterVolumeSpecName: "kube-api-access-vdzmq") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "kube-api-access-vdzmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.535737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data" (OuterVolumeSpecName: "config-data") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.561839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570518 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570604 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570653 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570672 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570700 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570719 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.570737 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdzmq\" (UniqueName: \"kubernetes.io/projected/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-kube-api-access-vdzmq\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.581329 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" (UID: "bab3dce9-ed00-49d3-9576-99d9a7cbdd9a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.616787 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.663888 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.672659 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:37 crc kubenswrapper[4776]: I0128 07:09:37.672686 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.419514 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bab3dce9-ed00-49d3-9576-99d9a7cbdd9a","Type":"ContainerDied","Data":"87d8b403c99f650b6ac90ddb2b994de9ac7d7b557d181ff16409c39fe90da324"} Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.419894 4776 scope.go:117] "RemoveContainer" containerID="d0d276b338d842190888892da687520404f2f083ff0b36044a1f5eb6dda63bb6" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.419575 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.436450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"78f1d6b5-48e7-4ad3-8066-acb3faf83f73","Type":"ContainerStarted","Data":"6e5331e6f10ff042ef70b7b4f779a8ca95f86d379625003014c891e701f82cd4"} Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.436488 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"78f1d6b5-48e7-4ad3-8066-acb3faf83f73","Type":"ContainerStarted","Data":"4dc8a78800b65a5e32f1847279fa42a7bb154e19d8fb17a00d81dd7ebbc0598c"} Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.440706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c2cf5aeb-349c-47d3-989b-d56e91f7ff51","Type":"ContainerStarted","Data":"11c06abad0799c72380510c5e2a82b9f4e5b26fe526d5be2213cb7191683d555"} Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.440761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c2cf5aeb-349c-47d3-989b-d56e91f7ff51","Type":"ContainerStarted","Data":"9627a936b16f1a59d49b0a946eb84e433ad83fbd86ec16ad7b9b009e4b66817b"} Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.462817 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 
07:09:38.466155 4776 scope.go:117] "RemoveContainer" containerID="4c93005f1832027045b278fb28689f32912bd457d90ef31e7cacbe5bd6cd8b14" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.473032 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.497639 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.497620117 podStartE2EDuration="2.497620117s" podCreationTimestamp="2026-01-28 07:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:38.489179098 +0000 UTC m=+1149.904839258" watchObservedRunningTime="2026-01-28 07:09:38.497620117 +0000 UTC m=+1149.913280267" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.507732 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:38 crc kubenswrapper[4776]: E0128 07:09:38.508208 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-httpd" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.508225 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-httpd" Jan 28 07:09:38 crc kubenswrapper[4776]: E0128 07:09:38.508245 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-log" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.508251 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-log" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.508458 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" 
containerName="glance-httpd" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.508480 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" containerName="glance-log" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.509493 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.515047 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.515349 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.521868 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.588968 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589037 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589468 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589713 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-logs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589852 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbld\" (UniqueName: \"kubernetes.io/projected/38e0faa6-840c-4e44-ad4a-16d42f83e194-kube-api-access-gfbld\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.589925 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.691909 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-config-data\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.691980 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.692013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.695597 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-logs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.695993 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.696041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e0faa6-840c-4e44-ad4a-16d42f83e194-logs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.696162 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbld\" (UniqueName: \"kubernetes.io/projected/38e0faa6-840c-4e44-ad4a-16d42f83e194-kube-api-access-gfbld\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.696236 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.697579 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.698241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " 
pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.698689 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.698773 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.703886 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.704341 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.704521 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0faa6-840c-4e44-ad4a-16d42f83e194-config-data\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc 
kubenswrapper[4776]: I0128 07:09:38.724898 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbld\" (UniqueName: \"kubernetes.io/projected/38e0faa6-840c-4e44-ad4a-16d42f83e194-kube-api-access-gfbld\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.735596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"38e0faa6-840c-4e44-ad4a-16d42f83e194\") " pod="openstack/glance-default-internal-api-0" Jan 28 07:09:38 crc kubenswrapper[4776]: I0128 07:09:38.865967 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:39 crc kubenswrapper[4776]: I0128 07:09:39.316639 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab3dce9-ed00-49d3-9576-99d9a7cbdd9a" path="/var/lib/kubelet/pods/bab3dce9-ed00-49d3-9576-99d9a7cbdd9a/volumes" Jan 28 07:09:39 crc kubenswrapper[4776]: I0128 07:09:39.439806 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 07:09:39 crc kubenswrapper[4776]: W0128 07:09:39.443483 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38e0faa6_840c_4e44_ad4a_16d42f83e194.slice/crio-4892d95b983390ad13f4be15697fc0a7774cb6e2e0d91df306f6223519b5c4b5 WatchSource:0}: Error finding container 4892d95b983390ad13f4be15697fc0a7774cb6e2e0d91df306f6223519b5c4b5: Status 404 returned error can't find the container with id 4892d95b983390ad13f4be15697fc0a7774cb6e2e0d91df306f6223519b5c4b5 Jan 28 07:09:39 crc kubenswrapper[4776]: I0128 07:09:39.456593 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"78f1d6b5-48e7-4ad3-8066-acb3faf83f73","Type":"ContainerStarted","Data":"378eeb8fc0b1eb2ce31b81f5e413b70eb1204f905e54b756ae25f20a41c584c1"} Jan 28 07:09:39 crc kubenswrapper[4776]: I0128 07:09:39.478403 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.47838102 podStartE2EDuration="3.47838102s" podCreationTimestamp="2026-01-28 07:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:39.471854002 +0000 UTC m=+1150.887514162" watchObservedRunningTime="2026-01-28 07:09:39.47838102 +0000 UTC m=+1150.894041200" Jan 28 07:09:40 crc kubenswrapper[4776]: I0128 07:09:40.474805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"38e0faa6-840c-4e44-ad4a-16d42f83e194","Type":"ContainerStarted","Data":"0a9ad99dbc7bdd6fb8adb460b37fd3256a991b12043439ad60f4a1bad127bf11"} Jan 28 07:09:40 crc kubenswrapper[4776]: I0128 07:09:40.475434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"38e0faa6-840c-4e44-ad4a-16d42f83e194","Type":"ContainerStarted","Data":"4892d95b983390ad13f4be15697fc0a7774cb6e2e0d91df306f6223519b5c4b5"} Jan 28 07:09:41 crc kubenswrapper[4776]: I0128 07:09:41.487300 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"38e0faa6-840c-4e44-ad4a-16d42f83e194","Type":"ContainerStarted","Data":"7e6fc7844e355850071160a2a231993a135a3b0381a10b07e05a3d59c2830087"} Jan 28 07:09:46 crc kubenswrapper[4776]: I0128 07:09:46.947283 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:46 crc kubenswrapper[4776]: I0128 07:09:46.982575 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.023695 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.023670101 podStartE2EDuration="9.023670101s" podCreationTimestamp="2026-01-28 07:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:41.508746612 +0000 UTC m=+1152.924406772" watchObservedRunningTime="2026-01-28 07:09:47.023670101 +0000 UTC m=+1158.439330281" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.064991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.065051 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.100765 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.110096 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.552183 4776 generic.go:334] "Generic (PLEG): container finished" podID="745dbbf2-ccdc-4a8f-878b-9208eb693cc5" containerID="dd68b836b756665baf0195ac99aaa3369ce42363de65b984002275c975ab30e1" exitCode=0 Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.552285 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vpps7" event={"ID":"745dbbf2-ccdc-4a8f-878b-9208eb693cc5","Type":"ContainerDied","Data":"dd68b836b756665baf0195ac99aaa3369ce42363de65b984002275c975ab30e1"} Jan 28 07:09:47 crc kubenswrapper[4776]: 
I0128 07:09:47.552645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.552893 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.552930 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.600861 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Jan 28 07:09:47 crc kubenswrapper[4776]: I0128 07:09:47.619264 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 07:09:48 crc kubenswrapper[4776]: I0128 07:09:48.866819 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:48 crc kubenswrapper[4776]: I0128 07:09:48.867141 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:48 crc kubenswrapper[4776]: I0128 07:09:48.930840 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:48 crc kubenswrapper[4776]: I0128 07:09:48.931973 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:48 crc kubenswrapper[4776]: I0128 07:09:48.932062 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.021950 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle\") pod \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.022014 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts\") pod \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.022123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data\") pod \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.022232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2dm5\" (UniqueName: \"kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5\") pod \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\" (UID: \"745dbbf2-ccdc-4a8f-878b-9208eb693cc5\") " Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.031755 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts" (OuterVolumeSpecName: "scripts") pod "745dbbf2-ccdc-4a8f-878b-9208eb693cc5" (UID: "745dbbf2-ccdc-4a8f-878b-9208eb693cc5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.053091 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5" (OuterVolumeSpecName: "kube-api-access-r2dm5") pod "745dbbf2-ccdc-4a8f-878b-9208eb693cc5" (UID: "745dbbf2-ccdc-4a8f-878b-9208eb693cc5"). InnerVolumeSpecName "kube-api-access-r2dm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.061053 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "745dbbf2-ccdc-4a8f-878b-9208eb693cc5" (UID: "745dbbf2-ccdc-4a8f-878b-9208eb693cc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.073200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data" (OuterVolumeSpecName: "config-data") pod "745dbbf2-ccdc-4a8f-878b-9208eb693cc5" (UID: "745dbbf2-ccdc-4a8f-878b-9208eb693cc5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.124746 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.124783 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.124794 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.124802 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2dm5\" (UniqueName: \"kubernetes.io/projected/745dbbf2-ccdc-4a8f-878b-9208eb693cc5-kube-api-access-r2dm5\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.545645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.555921 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.576176 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vpps7" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.576230 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vpps7" event={"ID":"745dbbf2-ccdc-4a8f-878b-9208eb693cc5","Type":"ContainerDied","Data":"8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9"} Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.576277 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecf81a6f669d0a4af07a46062d5cb41a7e668c3b1d8754173c6fe233d4d70f9" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.576324 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.576576 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.694626 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 07:09:49 crc kubenswrapper[4776]: E0128 07:09:49.695020 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745dbbf2-ccdc-4a8f-878b-9208eb693cc5" containerName="nova-cell0-conductor-db-sync" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.695036 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="745dbbf2-ccdc-4a8f-878b-9208eb693cc5" containerName="nova-cell0-conductor-db-sync" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.695216 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="745dbbf2-ccdc-4a8f-878b-9208eb693cc5" containerName="nova-cell0-conductor-db-sync" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.695819 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.706229 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tzvqv" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.706448 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.714498 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.839772 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.839892 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.839944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7628b\" (UniqueName: \"kubernetes.io/projected/091b6025-b6c8-4c2b-81d7-7b25aeaef620-kube-api-access-7628b\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.944625 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7628b\" (UniqueName: 
\"kubernetes.io/projected/091b6025-b6c8-4c2b-81d7-7b25aeaef620-kube-api-access-7628b\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.944715 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.944792 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.950186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.961300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/091b6025-b6c8-4c2b-81d7-7b25aeaef620-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:49 crc kubenswrapper[4776]: I0128 07:09:49.971227 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7628b\" (UniqueName: \"kubernetes.io/projected/091b6025-b6c8-4c2b-81d7-7b25aeaef620-kube-api-access-7628b\") pod \"nova-cell0-conductor-0\" (UID: 
\"091b6025-b6c8-4c2b-81d7-7b25aeaef620\") " pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:50 crc kubenswrapper[4776]: I0128 07:09:50.026955 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:50 crc kubenswrapper[4776]: W0128 07:09:50.631406 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod091b6025_b6c8_4c2b_81d7_7b25aeaef620.slice/crio-ac0423fc36fd24bef9ac4d2689fd9336ee01365c43f43d0e3c0c609d5858454a WatchSource:0}: Error finding container ac0423fc36fd24bef9ac4d2689fd9336ee01365c43f43d0e3c0c609d5858454a: Status 404 returned error can't find the container with id ac0423fc36fd24bef9ac4d2689fd9336ee01365c43f43d0e3c0c609d5858454a Jan 28 07:09:50 crc kubenswrapper[4776]: I0128 07:09:50.634013 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.601382 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.601687 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.601369 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"091b6025-b6c8-4c2b-81d7-7b25aeaef620","Type":"ContainerStarted","Data":"2956690a303dec7ef72d83afd44486fd9224cc2c11cad38a4b9eb5eb91cc3826"} Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.601828 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"091b6025-b6c8-4c2b-81d7-7b25aeaef620","Type":"ContainerStarted","Data":"ac0423fc36fd24bef9ac4d2689fd9336ee01365c43f43d0e3c0c609d5858454a"} Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.602229 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.637647 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.63762331 podStartE2EDuration="2.63762331s" podCreationTimestamp="2026-01-28 07:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:51.629622253 +0000 UTC m=+1163.045282423" watchObservedRunningTime="2026-01-28 07:09:51.63762331 +0000 UTC m=+1163.053283470" Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.770735 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:51 crc kubenswrapper[4776]: I0128 07:09:51.790060 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.601242 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.626808 4776 generic.go:334] "Generic (PLEG): container finished" podID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerID="0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa" exitCode=137 Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.626886 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.626922 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerDied","Data":"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa"} Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.627596 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148c5aca-4eba-4912-8d65-cb8e4820948b","Type":"ContainerDied","Data":"9d86522de39d71d513eef9233e51f150be8d24a3c9fdc02b02b6f6a68a21f3e4"} Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.627620 4776 scope.go:117] "RemoveContainer" containerID="0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.665086 4776 scope.go:117] "RemoveContainer" containerID="14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.706220 4776 scope.go:117] "RemoveContainer" containerID="bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.712286 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.712588 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.712742 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.712839 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.713090 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.713490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.713764 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.713971 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.714170 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h26zj\" (UniqueName: \"kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj\") pod \"148c5aca-4eba-4912-8d65-cb8e4820948b\" (UID: \"148c5aca-4eba-4912-8d65-cb8e4820948b\") " Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.715162 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.715317 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148c5aca-4eba-4912-8d65-cb8e4820948b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.720725 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts" (OuterVolumeSpecName: "scripts") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.720903 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj" (OuterVolumeSpecName: "kube-api-access-h26zj") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "kube-api-access-h26zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.734832 4776 scope.go:117] "RemoveContainer" containerID="3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.742864 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.792100 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.817458 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.817813 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.817825 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h26zj\" (UniqueName: \"kubernetes.io/projected/148c5aca-4eba-4912-8d65-cb8e4820948b-kube-api-access-h26zj\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.817833 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.842251 4776 scope.go:117] "RemoveContainer" containerID="0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.845750 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data" (OuterVolumeSpecName: "config-data") pod "148c5aca-4eba-4912-8d65-cb8e4820948b" (UID: "148c5aca-4eba-4912-8d65-cb8e4820948b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:09:52 crc kubenswrapper[4776]: E0128 07:09:52.845762 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa\": container with ID starting with 0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa not found: ID does not exist" containerID="0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.845811 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa"} err="failed to get container status \"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa\": rpc error: code = NotFound desc = could not find container \"0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa\": container with ID starting with 0bd90e69b57794f14737f05121f26c89c2c13fb658a3f921f6d097b00df684aa not found: ID does not exist" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.845845 4776 scope.go:117] "RemoveContainer" containerID="14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d" Jan 28 07:09:52 crc kubenswrapper[4776]: E0128 07:09:52.846588 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d\": container with ID starting with 14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d not found: ID does not exist" containerID="14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.846704 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d"} err="failed 
to get container status \"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d\": rpc error: code = NotFound desc = could not find container \"14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d\": container with ID starting with 14360c6e93ecb99c455232c35e2c1d9624ab8bcc9268e8d8feb87bc9aac52f0d not found: ID does not exist" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.846728 4776 scope.go:117] "RemoveContainer" containerID="bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af" Jan 28 07:09:52 crc kubenswrapper[4776]: E0128 07:09:52.847448 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af\": container with ID starting with bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af not found: ID does not exist" containerID="bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.847516 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af"} err="failed to get container status \"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af\": rpc error: code = NotFound desc = could not find container \"bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af\": container with ID starting with bde59a5cd27ad9e13ddd8ab2a06ec4ca29d0389e0c72f387cb7bfb0c981d90af not found: ID does not exist" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.847573 4776 scope.go:117] "RemoveContainer" containerID="3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4" Jan 28 07:09:52 crc kubenswrapper[4776]: E0128 07:09:52.847888 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4\": container with ID starting with 3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4 not found: ID does not exist" containerID="3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.847946 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4"} err="failed to get container status \"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4\": rpc error: code = NotFound desc = could not find container \"3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4\": container with ID starting with 3824ca3c9fe1ad07df53349ab8aa84b9d74c2b08b0e25250aa37f45ddda014d4 not found: ID does not exist" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.919394 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148c5aca-4eba-4912-8d65-cb8e4820948b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.974610 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:52 crc kubenswrapper[4776]: I0128 07:09:52.986317 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.002010 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:53 crc kubenswrapper[4776]: E0128 07:09:53.002523 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-central-agent" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.002629 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-central-agent" Jan 28 07:09:53 
crc kubenswrapper[4776]: E0128 07:09:53.002695 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="sg-core" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.002757 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="sg-core" Jan 28 07:09:53 crc kubenswrapper[4776]: E0128 07:09:53.002829 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-notification-agent" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.002879 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-notification-agent" Jan 28 07:09:53 crc kubenswrapper[4776]: E0128 07:09:53.002945 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="proxy-httpd" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.002994 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="proxy-httpd" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.003219 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-notification-agent" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.003288 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="ceilometer-central-agent" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.003347 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" containerName="sg-core" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.003406 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" 
containerName="proxy-httpd" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.007153 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.011172 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.011404 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.022511 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.122470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.122704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.122877 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prgn4\" (UniqueName: \"kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.123006 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.123123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.123279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.123358 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225136 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" 
Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225613 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225703 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225817 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prgn4\" (UniqueName: \"kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225911 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.225987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.226874 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.227506 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.230004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.230583 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.231815 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.234508 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.248499 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-prgn4\" (UniqueName: \"kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4\") pod \"ceilometer-0\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.319318 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148c5aca-4eba-4912-8d65-cb8e4820948b" path="/var/lib/kubelet/pods/148c5aca-4eba-4912-8d65-cb8e4820948b/volumes" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.376301 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:09:53 crc kubenswrapper[4776]: I0128 07:09:53.928065 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:09:53 crc kubenswrapper[4776]: W0128 07:09:53.930431 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e63f1d_9670_4be6_ae47_f59ef27fdc2e.slice/crio-5dee53da67b3f636aada3cf42b77077c5c60edaa010b5e9d9b5ae7ac99403f77 WatchSource:0}: Error finding container 5dee53da67b3f636aada3cf42b77077c5c60edaa010b5e9d9b5ae7ac99403f77: Status 404 returned error can't find the container with id 5dee53da67b3f636aada3cf42b77077c5c60edaa010b5e9d9b5ae7ac99403f77 Jan 28 07:09:54 crc kubenswrapper[4776]: I0128 07:09:54.649479 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerStarted","Data":"5dee53da67b3f636aada3cf42b77077c5c60edaa010b5e9d9b5ae7ac99403f77"} Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.071997 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.533016 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-7qvb2"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.534759 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.539205 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.540194 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.552613 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7qvb2"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.577059 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.577158 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8rl\" (UniqueName: \"kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.577323 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc 
kubenswrapper[4776]: I0128 07:09:55.577357 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.660032 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerStarted","Data":"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f"} Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.660083 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerStarted","Data":"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf"} Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.679587 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.679648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.679684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.679763 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8rl\" (UniqueName: \"kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.692320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.693187 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.695017 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.716981 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8rl\" (UniqueName: 
\"kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl\") pod \"nova-cell0-cell-mapping-7qvb2\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.753886 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.758673 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.760789 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.770586 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.846809 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.859352 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.867324 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.878785 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.880046 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.887051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjq7g\" (UniqueName: \"kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.887105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.887140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.887213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.887253 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.892642 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.907604 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.949269 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.968919 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.976665 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.985621 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.988184 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.991255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.991414 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l42r\" (UniqueName: \"kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.991477 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.991526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.991609 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.992610 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995591 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " 
pod="openstack/nova-scheduler-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995719 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwmn\" (UniqueName: \"kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995791 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjq7g\" (UniqueName: \"kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:55 crc kubenswrapper[4776]: I0128 07:09:55.995825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.000295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.003127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.029099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjq7g\" (UniqueName: \"kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g\") pod \"nova-api-0\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " pod="openstack/nova-api-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.076644 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.078254 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.083663 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.099441 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105371 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmcf\" (UniqueName: \"kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l42r\" (UniqueName: \"kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105664 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 
07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105795 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105851 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.105941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwmn\" (UniqueName: \"kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.109742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.125033 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.131411 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l42r\" (UniqueName: \"kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.132127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.138268 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.150350 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data\") pod \"nova-metadata-0\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc 
kubenswrapper[4776]: I0128 07:09:56.151748 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwmn\" (UniqueName: \"kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn\") pod \"nova-scheduler-0\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.207183 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.216900 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.226827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.226879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmcf\" (UniqueName: \"kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227190 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config\") pod 
\"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227310 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227395 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227564 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.227609 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc86m\" (UniqueName: \"kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m\") pod 
\"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.234534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.243226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.244029 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.330809 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.330879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.330918 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc86m\" (UniqueName: 
\"kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.330990 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.331022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.331096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.334053 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.334243 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb\") pod 
\"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.334902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.352123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.369961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.404424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc86m\" (UniqueName: \"kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m\") pod \"dnsmasq-dns-757b4f8459-ckcck\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.411732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmcf\" (UniqueName: \"kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"b496787c-32af-4ac1-b15d-814ee1abc744\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.431265 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.450224 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.716414 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7qvb2"] Jan 28 07:09:56 crc kubenswrapper[4776]: I0128 07:09:56.923428 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.052430 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ww8wt"] Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.054077 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.062932 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.062966 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.085090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ww8wt"] Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.119244 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.187397 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.187467 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.187535 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 
07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.187679 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfbg\" (UniqueName: \"kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.222980 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:09:57 crc kubenswrapper[4776]: W0128 07:09:57.234677 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad3095c4_70f9_4dd6_8797_76f60a334942.slice/crio-c2820e0567bfdc073fb3d0433d4e5c288821a48cf79c0339b2d72d4b5c1998f8 WatchSource:0}: Error finding container c2820e0567bfdc073fb3d0433d4e5c288821a48cf79c0339b2d72d4b5c1998f8: Status 404 returned error can't find the container with id c2820e0567bfdc073fb3d0433d4e5c288821a48cf79c0339b2d72d4b5c1998f8 Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.239085 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.296861 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfbg\" (UniqueName: \"kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.296971 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.297000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.297044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.302460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.302730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.303103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: 
\"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.321785 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfbg\" (UniqueName: \"kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg\") pod \"nova-cell1-conductor-db-sync-ww8wt\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.399966 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.413763 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:09:57 crc kubenswrapper[4776]: W0128 07:09:57.451406 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb496787c_32af_4ac1_b15d_814ee1abc744.slice/crio-bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1 WatchSource:0}: Error finding container bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1: Status 404 returned error can't find the container with id bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1 Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.722830 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerStarted","Data":"b2db7fe78a4eb7f461a68c4b9eafc26c5279295976b99f0539596bf4c81c8d48"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.732919 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7qvb2" 
event={"ID":"bf66e818-91d4-4a50-b10e-b40f0dc8754d","Type":"ContainerStarted","Data":"72dd7aae6ab26716dba4392a2c43894b15d54a4848029158565f5e0c93aa7081"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.732964 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7qvb2" event={"ID":"bf66e818-91d4-4a50-b10e-b40f0dc8754d","Type":"ContainerStarted","Data":"0ef854ea48969e654ebc62720b207684db1fd265795279f8e354b5ae5d914b30"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.737678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad3095c4-70f9-4dd6-8797-76f60a334942","Type":"ContainerStarted","Data":"c2820e0567bfdc073fb3d0433d4e5c288821a48cf79c0339b2d72d4b5c1998f8"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.740730 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b496787c-32af-4ac1-b15d-814ee1abc744","Type":"ContainerStarted","Data":"bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.769767 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7qvb2" podStartSLOduration=2.769747899 podStartE2EDuration="2.769747899s" podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:57.766382938 +0000 UTC m=+1169.182043098" watchObservedRunningTime="2026-01-28 07:09:57.769747899 +0000 UTC m=+1169.185408059" Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.783098 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerStarted","Data":"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 
07:09:57.793844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerStarted","Data":"72352250c3d7de55a7572ed0ca4300a8f90907041517aa237be868c0e7fb4c34"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.800509 4776 generic.go:334] "Generic (PLEG): container finished" podID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerID="c4afce0457e6f296bd9294666fe4eb45f70f3f6fc5bb1d5483019aec01879d74" exitCode=0 Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.800582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" event={"ID":"af39015e-a0f9-4ebd-b6c5-865f3081b2da","Type":"ContainerDied","Data":"c4afce0457e6f296bd9294666fe4eb45f70f3f6fc5bb1d5483019aec01879d74"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.800612 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" event={"ID":"af39015e-a0f9-4ebd-b6c5-865f3081b2da","Type":"ContainerStarted","Data":"5e634e178bf659875670ce7ea5a32b637a685031c89f9f55f9c28b98a803562c"} Jan 28 07:09:57 crc kubenswrapper[4776]: I0128 07:09:57.938000 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ww8wt"] Jan 28 07:09:57 crc kubenswrapper[4776]: W0128 07:09:57.977670 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21b0d9dc_05c6_417b_adaa_85d82bf95aeb.slice/crio-5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc WatchSource:0}: Error finding container 5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc: Status 404 returned error can't find the container with id 5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.850716 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-ww8wt" event={"ID":"21b0d9dc-05c6-417b-adaa-85d82bf95aeb","Type":"ContainerStarted","Data":"22bc85da721c95a33e45c535ae41c5000cec07383fd28065763096bad879272d"} Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.850975 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" event={"ID":"21b0d9dc-05c6-417b-adaa-85d82bf95aeb","Type":"ContainerStarted","Data":"5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc"} Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.863077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" event={"ID":"af39015e-a0f9-4ebd-b6c5-865f3081b2da","Type":"ContainerStarted","Data":"049705c2e9b72b399b18a24b9a710269ecb8af45f04cd1b326c3db1711c29dde"} Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.866179 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.906799 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" podStartSLOduration=3.9067798099999997 podStartE2EDuration="3.90677981s" podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:09:58.903709737 +0000 UTC m=+1170.319369897" watchObservedRunningTime="2026-01-28 07:09:58.90677981 +0000 UTC m=+1170.322439970" Jan 28 07:09:58 crc kubenswrapper[4776]: I0128 07:09:58.914472 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" podStartSLOduration=1.914454637 podStartE2EDuration="1.914454637s" podCreationTimestamp="2026-01-28 07:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-28 07:09:58.866346698 +0000 UTC m=+1170.282006858" watchObservedRunningTime="2026-01-28 07:09:58.914454637 +0000 UTC m=+1170.330114797" Jan 28 07:09:59 crc kubenswrapper[4776]: I0128 07:09:59.874667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerStarted","Data":"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d"} Jan 28 07:09:59 crc kubenswrapper[4776]: I0128 07:09:59.875482 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:09:59 crc kubenswrapper[4776]: I0128 07:09:59.909795 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:09:59 crc kubenswrapper[4776]: I0128 07:09:59.922429 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.297184153 podStartE2EDuration="7.92240998s" podCreationTimestamp="2026-01-28 07:09:52 +0000 UTC" firstStartedPulling="2026-01-28 07:09:53.933014316 +0000 UTC m=+1165.348674476" lastFinishedPulling="2026-01-28 07:09:58.558240143 +0000 UTC m=+1169.973900303" observedRunningTime="2026-01-28 07:09:59.909047739 +0000 UTC m=+1171.324707899" watchObservedRunningTime="2026-01-28 07:09:59.92240998 +0000 UTC m=+1171.338070140" Jan 28 07:09:59 crc kubenswrapper[4776]: I0128 07:09:59.956800 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.901862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerStarted","Data":"630011fce5539fafb71d6ab076243d692a62b66f8575f2a229b500205a3ab931"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.902390 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerStarted","Data":"35b2655d70ebcd2a7ec9d07be917635f57b04a5d592a537d2ab2320c9efc0517"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.904212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerStarted","Data":"ae29791033ac7b5266aac2df203bfbb21af45f246276c8ea06f0b5ccdb3a99a4"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.904252 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerStarted","Data":"107a91f95a36a34c3052a52e3a3d60ca9bddbd6b024399d92e9fe1ea49578c95"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.904350 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-log" containerID="cri-o://107a91f95a36a34c3052a52e3a3d60ca9bddbd6b024399d92e9fe1ea49578c95" gracePeriod=30 Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.904588 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-metadata" containerID="cri-o://ae29791033ac7b5266aac2df203bfbb21af45f246276c8ea06f0b5ccdb3a99a4" gracePeriod=30 Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.911485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad3095c4-70f9-4dd6-8797-76f60a334942","Type":"ContainerStarted","Data":"cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.915292 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b496787c-32af-4ac1-b15d-814ee1abc744","Type":"ContainerStarted","Data":"9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871"} Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.915301 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b496787c-32af-4ac1-b15d-814ee1abc744" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871" gracePeriod=30 Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.930241 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.740325964 podStartE2EDuration="6.930214177s" podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="2026-01-28 07:09:56.957607797 +0000 UTC m=+1168.373267957" lastFinishedPulling="2026-01-28 07:10:01.14749601 +0000 UTC m=+1172.563156170" observedRunningTime="2026-01-28 07:10:01.927625077 +0000 UTC m=+1173.343285237" watchObservedRunningTime="2026-01-28 07:10:01.930214177 +0000 UTC m=+1173.345874347" Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.953592 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.259779698 podStartE2EDuration="6.953569138s" podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="2026-01-28 07:09:57.46998523 +0000 UTC m=+1168.885645390" lastFinishedPulling="2026-01-28 07:10:01.16377467 +0000 UTC m=+1172.579434830" observedRunningTime="2026-01-28 07:10:01.94808447 +0000 UTC m=+1173.363744640" watchObservedRunningTime="2026-01-28 07:10:01.953569138 +0000 UTC m=+1173.369229298" Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.963868 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.923089063 podStartE2EDuration="6.963850676s" 
podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="2026-01-28 07:09:57.127050205 +0000 UTC m=+1168.542710365" lastFinishedPulling="2026-01-28 07:10:01.167811818 +0000 UTC m=+1172.583471978" observedRunningTime="2026-01-28 07:10:01.962010956 +0000 UTC m=+1173.377671116" watchObservedRunningTime="2026-01-28 07:10:01.963850676 +0000 UTC m=+1173.379510826" Jan 28 07:10:01 crc kubenswrapper[4776]: I0128 07:10:01.987197 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.075776948 podStartE2EDuration="6.987172196s" podCreationTimestamp="2026-01-28 07:09:55 +0000 UTC" firstStartedPulling="2026-01-28 07:09:57.239722499 +0000 UTC m=+1168.655382659" lastFinishedPulling="2026-01-28 07:10:01.151117747 +0000 UTC m=+1172.566777907" observedRunningTime="2026-01-28 07:10:01.977757851 +0000 UTC m=+1173.393418021" watchObservedRunningTime="2026-01-28 07:10:01.987172196 +0000 UTC m=+1173.402832366" Jan 28 07:10:02 crc kubenswrapper[4776]: I0128 07:10:02.929738 4776 generic.go:334] "Generic (PLEG): container finished" podID="13487701-e252-4b0c-8991-df2864bcbde5" containerID="107a91f95a36a34c3052a52e3a3d60ca9bddbd6b024399d92e9fe1ea49578c95" exitCode=143 Jan 28 07:10:02 crc kubenswrapper[4776]: I0128 07:10:02.930815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerDied","Data":"107a91f95a36a34c3052a52e3a3d60ca9bddbd6b024399d92e9fe1ea49578c95"} Jan 28 07:10:05 crc kubenswrapper[4776]: I0128 07:10:05.957001 4776 generic.go:334] "Generic (PLEG): container finished" podID="21b0d9dc-05c6-417b-adaa-85d82bf95aeb" containerID="22bc85da721c95a33e45c535ae41c5000cec07383fd28065763096bad879272d" exitCode=0 Jan 28 07:10:05 crc kubenswrapper[4776]: I0128 07:10:05.957075 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" 
event={"ID":"21b0d9dc-05c6-417b-adaa-85d82bf95aeb","Type":"ContainerDied","Data":"22bc85da721c95a33e45c535ae41c5000cec07383fd28065763096bad879272d"} Jan 28 07:10:05 crc kubenswrapper[4776]: I0128 07:10:05.959322 4776 generic.go:334] "Generic (PLEG): container finished" podID="bf66e818-91d4-4a50-b10e-b40f0dc8754d" containerID="72dd7aae6ab26716dba4392a2c43894b15d54a4848029158565f5e0c93aa7081" exitCode=0 Jan 28 07:10:05 crc kubenswrapper[4776]: I0128 07:10:05.959356 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7qvb2" event={"ID":"bf66e818-91d4-4a50-b10e-b40f0dc8754d","Type":"ContainerDied","Data":"72dd7aae6ab26716dba4392a2c43894b15d54a4848029158565f5e0c93aa7081"} Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.100347 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.100399 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.208415 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.208472 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.244825 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.244861 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.293666 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.432532 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.451762 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.552900 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.553248 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="dnsmasq-dns" containerID="cri-o://94e09a9c1472e2917b8bbddf28d644810bebee8c18193b3624497336fae60209" gracePeriod=10 Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.987065 4776 generic.go:334] "Generic (PLEG): container finished" podID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerID="94e09a9c1472e2917b8bbddf28d644810bebee8c18193b3624497336fae60209" exitCode=0 Jan 28 07:10:06 crc kubenswrapper[4776]: I0128 07:10:06.987269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" event={"ID":"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd","Type":"ContainerDied","Data":"94e09a9c1472e2917b8bbddf28d644810bebee8c18193b3624497336fae60209"} Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.065572 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.142050 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.189779 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.189992 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.241625 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.242503 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.242933 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.243031 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4lw\" 
(UniqueName: \"kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.243370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.243449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb\") pod \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\" (UID: \"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.256731 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw" (OuterVolumeSpecName: "kube-api-access-pd4lw") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "kube-api-access-pd4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.303931 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.327035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.330092 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.349266 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.349288 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.349298 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.349311 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4lw\" (UniqueName: \"kubernetes.io/projected/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-kube-api-access-pd4lw\") on node \"crc\" DevicePath \"\"" Jan 28 
07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.407433 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config" (OuterVolumeSpecName: "config") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.416034 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.419976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" (UID: "ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.454001 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.454034 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.462089 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555026 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data\") pod \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555109 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data\") pod \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555209 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts\") pod \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555285 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpfbg\" (UniqueName: \"kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg\") pod \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts\") pod \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle\") pod \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\" (UID: \"21b0d9dc-05c6-417b-adaa-85d82bf95aeb\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555404 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle\") pod \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.555423 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc8rl\" (UniqueName: \"kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl\") pod \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\" (UID: \"bf66e818-91d4-4a50-b10e-b40f0dc8754d\") " Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.559015 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts" (OuterVolumeSpecName: "scripts") pod "bf66e818-91d4-4a50-b10e-b40f0dc8754d" (UID: "bf66e818-91d4-4a50-b10e-b40f0dc8754d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.559382 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl" (OuterVolumeSpecName: "kube-api-access-gc8rl") pod "bf66e818-91d4-4a50-b10e-b40f0dc8754d" (UID: "bf66e818-91d4-4a50-b10e-b40f0dc8754d"). InnerVolumeSpecName "kube-api-access-gc8rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.559507 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts" (OuterVolumeSpecName: "scripts") pod "21b0d9dc-05c6-417b-adaa-85d82bf95aeb" (UID: "21b0d9dc-05c6-417b-adaa-85d82bf95aeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.560654 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg" (OuterVolumeSpecName: "kube-api-access-xpfbg") pod "21b0d9dc-05c6-417b-adaa-85d82bf95aeb" (UID: "21b0d9dc-05c6-417b-adaa-85d82bf95aeb"). InnerVolumeSpecName "kube-api-access-xpfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.583477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf66e818-91d4-4a50-b10e-b40f0dc8754d" (UID: "bf66e818-91d4-4a50-b10e-b40f0dc8754d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.586270 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data" (OuterVolumeSpecName: "config-data") pod "21b0d9dc-05c6-417b-adaa-85d82bf95aeb" (UID: "21b0d9dc-05c6-417b-adaa-85d82bf95aeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.587739 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21b0d9dc-05c6-417b-adaa-85d82bf95aeb" (UID: "21b0d9dc-05c6-417b-adaa-85d82bf95aeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.592030 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data" (OuterVolumeSpecName: "config-data") pod "bf66e818-91d4-4a50-b10e-b40f0dc8754d" (UID: "bf66e818-91d4-4a50-b10e-b40f0dc8754d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658185 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658214 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc8rl\" (UniqueName: \"kubernetes.io/projected/bf66e818-91d4-4a50-b10e-b40f0dc8754d-kube-api-access-gc8rl\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658226 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658235 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-config-data\") on node \"crc\" DevicePath 
\"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658244 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658252 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpfbg\" (UniqueName: \"kubernetes.io/projected/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-kube-api-access-xpfbg\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658263 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66e818-91d4-4a50-b10e-b40f0dc8754d-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.658272 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b0d9dc-05c6-417b-adaa-85d82bf95aeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.997684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" event={"ID":"21b0d9dc-05c6-417b-adaa-85d82bf95aeb","Type":"ContainerDied","Data":"5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc"} Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.998005 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5567256e8b79e2a1cce6ec3bd1b089999b4d09a403ecee1402eaaf19287b5cbc" Jan 28 07:10:07 crc kubenswrapper[4776]: I0128 07:10:07.998053 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ww8wt" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.002685 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7qvb2" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.002695 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7qvb2" event={"ID":"bf66e818-91d4-4a50-b10e-b40f0dc8754d","Type":"ContainerDied","Data":"0ef854ea48969e654ebc62720b207684db1fd265795279f8e354b5ae5d914b30"} Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.002743 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ef854ea48969e654ebc62720b207684db1fd265795279f8e354b5ae5d914b30" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.007659 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.009914 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jklmf" event={"ID":"ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd","Type":"ContainerDied","Data":"d696e3e8455feef20c6c6b1208786b41f2289d81cff0f9a09b550f0c0cf36998"} Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.009996 4776 scope.go:117] "RemoveContainer" containerID="94e09a9c1472e2917b8bbddf28d644810bebee8c18193b3624497336fae60209" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.069811 4776 scope.go:117] "RemoveContainer" containerID="00a7e038847214ef057a4d05eed478b4719250f7075da6593871df61060a45d3" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.103303 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.111013 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jklmf"] Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.120907 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 07:10:08 crc kubenswrapper[4776]: E0128 
07:10:08.121411 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b0d9dc-05c6-417b-adaa-85d82bf95aeb" containerName="nova-cell1-conductor-db-sync" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.121435 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b0d9dc-05c6-417b-adaa-85d82bf95aeb" containerName="nova-cell1-conductor-db-sync" Jan 28 07:10:08 crc kubenswrapper[4776]: E0128 07:10:08.121453 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="init" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.121461 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="init" Jan 28 07:10:08 crc kubenswrapper[4776]: E0128 07:10:08.121474 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="dnsmasq-dns" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.121480 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="dnsmasq-dns" Jan 28 07:10:08 crc kubenswrapper[4776]: E0128 07:10:08.121491 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf66e818-91d4-4a50-b10e-b40f0dc8754d" containerName="nova-manage" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.121497 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf66e818-91d4-4a50-b10e-b40f0dc8754d" containerName="nova-manage" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.128833 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" containerName="dnsmasq-dns" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.128897 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf66e818-91d4-4a50-b10e-b40f0dc8754d" containerName="nova-manage" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.128917 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="21b0d9dc-05c6-417b-adaa-85d82bf95aeb" containerName="nova-cell1-conductor-db-sync" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.131853 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.141282 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.167918 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.254909 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.255525 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-log" containerID="cri-o://35b2655d70ebcd2a7ec9d07be917635f57b04a5d592a537d2ab2320c9efc0517" gracePeriod=30 Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.255703 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-api" containerID="cri-o://630011fce5539fafb71d6ab076243d692a62b66f8575f2a229b500205a3ab931" gracePeriod=30 Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.270413 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.270470 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.270489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5wk4\" (UniqueName: \"kubernetes.io/projected/10f77e4a-2026-45c0-80c8-a5d8b18046df-kube-api-access-p5wk4\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.279981 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.372864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.372913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.372930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5wk4\" (UniqueName: \"kubernetes.io/projected/10f77e4a-2026-45c0-80c8-a5d8b18046df-kube-api-access-p5wk4\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 
28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.378323 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.378472 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77e4a-2026-45c0-80c8-a5d8b18046df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.392390 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5wk4\" (UniqueName: \"kubernetes.io/projected/10f77e4a-2026-45c0-80c8-a5d8b18046df-kube-api-access-p5wk4\") pod \"nova-cell1-conductor-0\" (UID: \"10f77e4a-2026-45c0-80c8-a5d8b18046df\") " pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.464287 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:08 crc kubenswrapper[4776]: I0128 07:10:08.993378 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 07:10:09 crc kubenswrapper[4776]: I0128 07:10:09.028156 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"10f77e4a-2026-45c0-80c8-a5d8b18046df","Type":"ContainerStarted","Data":"0d82327cd667ce05b9a82192d1fcafeb4d71e25638ec0a90b09cf14a69984208"} Jan 28 07:10:09 crc kubenswrapper[4776]: I0128 07:10:09.041917 4776 generic.go:334] "Generic (PLEG): container finished" podID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerID="35b2655d70ebcd2a7ec9d07be917635f57b04a5d592a537d2ab2320c9efc0517" exitCode=143 Jan 28 07:10:09 crc kubenswrapper[4776]: I0128 07:10:09.042009 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerDied","Data":"35b2655d70ebcd2a7ec9d07be917635f57b04a5d592a537d2ab2320c9efc0517"} Jan 28 07:10:09 crc kubenswrapper[4776]: I0128 07:10:09.042126 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerName="nova-scheduler-scheduler" containerID="cri-o://cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" gracePeriod=30 Jan 28 07:10:09 crc kubenswrapper[4776]: I0128 07:10:09.319457 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd" path="/var/lib/kubelet/pods/ee24ede0-05b1-4b07-a6b5-2d7fc8b0cefd/volumes" Jan 28 07:10:10 crc kubenswrapper[4776]: I0128 07:10:10.052816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"10f77e4a-2026-45c0-80c8-a5d8b18046df","Type":"ContainerStarted","Data":"5fa6268fa05f32798f0a54d99602e31b45562b712edcb8b4638973eb2ab4eacc"} Jan 28 07:10:10 crc kubenswrapper[4776]: I0128 07:10:10.053701 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:10 crc kubenswrapper[4776]: I0128 07:10:10.074108 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.074085068 podStartE2EDuration="2.074085068s" podCreationTimestamp="2026-01-28 07:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:10.067003357 +0000 UTC m=+1181.482663517" watchObservedRunningTime="2026-01-28 07:10:10.074085068 +0000 UTC m=+1181.489745228" Jan 28 07:10:11 crc kubenswrapper[4776]: E0128 07:10:11.246378 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:11 crc kubenswrapper[4776]: E0128 07:10:11.250206 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:11 crc kubenswrapper[4776]: E0128 07:10:11.251667 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:11 crc kubenswrapper[4776]: E0128 07:10:11.251807 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerName="nova-scheduler-scheduler" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.086125 4776 generic.go:334] "Generic (PLEG): container finished" podID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerID="630011fce5539fafb71d6ab076243d692a62b66f8575f2a229b500205a3ab931" exitCode=0 Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.086189 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerDied","Data":"630011fce5539fafb71d6ab076243d692a62b66f8575f2a229b500205a3ab931"} Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.086925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"92e5558c-b4d0-4688-bbbe-89da202f6c56","Type":"ContainerDied","Data":"72352250c3d7de55a7572ed0ca4300a8f90907041517aa237be868c0e7fb4c34"} Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.086941 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72352250c3d7de55a7572ed0ca4300a8f90907041517aa237be868c0e7fb4c34" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.090138 4776 generic.go:334] "Generic (PLEG): container finished" podID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerID="cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" exitCode=0 Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.090168 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ad3095c4-70f9-4dd6-8797-76f60a334942","Type":"ContainerDied","Data":"cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c"} Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.147790 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.270236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs\") pod \"92e5558c-b4d0-4688-bbbe-89da202f6c56\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.270355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle\") pod \"92e5558c-b4d0-4688-bbbe-89da202f6c56\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.270504 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjq7g\" (UniqueName: \"kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g\") pod \"92e5558c-b4d0-4688-bbbe-89da202f6c56\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.270534 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data\") pod \"92e5558c-b4d0-4688-bbbe-89da202f6c56\" (UID: \"92e5558c-b4d0-4688-bbbe-89da202f6c56\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.271128 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs" (OuterVolumeSpecName: "logs") pod 
"92e5558c-b4d0-4688-bbbe-89da202f6c56" (UID: "92e5558c-b4d0-4688-bbbe-89da202f6c56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.287434 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g" (OuterVolumeSpecName: "kube-api-access-gjq7g") pod "92e5558c-b4d0-4688-bbbe-89da202f6c56" (UID: "92e5558c-b4d0-4688-bbbe-89da202f6c56"). InnerVolumeSpecName "kube-api-access-gjq7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.314117 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data" (OuterVolumeSpecName: "config-data") pod "92e5558c-b4d0-4688-bbbe-89da202f6c56" (UID: "92e5558c-b4d0-4688-bbbe-89da202f6c56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.318995 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92e5558c-b4d0-4688-bbbe-89da202f6c56" (UID: "92e5558c-b4d0-4688-bbbe-89da202f6c56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.373426 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.373785 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjq7g\" (UniqueName: \"kubernetes.io/projected/92e5558c-b4d0-4688-bbbe-89da202f6c56-kube-api-access-gjq7g\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.373796 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e5558c-b4d0-4688-bbbe-89da202f6c56-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.373803 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e5558c-b4d0-4688-bbbe-89da202f6c56-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.435839 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.475251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwmn\" (UniqueName: \"kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn\") pod \"ad3095c4-70f9-4dd6-8797-76f60a334942\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.475419 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle\") pod \"ad3095c4-70f9-4dd6-8797-76f60a334942\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.475627 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data\") pod \"ad3095c4-70f9-4dd6-8797-76f60a334942\" (UID: \"ad3095c4-70f9-4dd6-8797-76f60a334942\") " Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.494510 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn" (OuterVolumeSpecName: "kube-api-access-wzwmn") pod "ad3095c4-70f9-4dd6-8797-76f60a334942" (UID: "ad3095c4-70f9-4dd6-8797-76f60a334942"). InnerVolumeSpecName "kube-api-access-wzwmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.513297 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data" (OuterVolumeSpecName: "config-data") pod "ad3095c4-70f9-4dd6-8797-76f60a334942" (UID: "ad3095c4-70f9-4dd6-8797-76f60a334942"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.520927 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3095c4-70f9-4dd6-8797-76f60a334942" (UID: "ad3095c4-70f9-4dd6-8797-76f60a334942"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.579165 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwmn\" (UniqueName: \"kubernetes.io/projected/ad3095c4-70f9-4dd6-8797-76f60a334942-kube-api-access-wzwmn\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.579234 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:13 crc kubenswrapper[4776]: I0128 07:10:13.579253 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3095c4-70f9-4dd6-8797-76f60a334942-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.103085 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.103142 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.103085 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad3095c4-70f9-4dd6-8797-76f60a334942","Type":"ContainerDied","Data":"c2820e0567bfdc073fb3d0433d4e5c288821a48cf79c0339b2d72d4b5c1998f8"} Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.104086 4776 scope.go:117] "RemoveContainer" containerID="cc5e9a9a71ce1a6f27052b931efa0ff4e8fd734419205ffc05a35b5109bb278c" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.135059 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.152388 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.163674 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: E0128 07:10:14.164122 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerName="nova-scheduler-scheduler" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.164147 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerName="nova-scheduler-scheduler" Jan 28 07:10:14 crc kubenswrapper[4776]: E0128 07:10:14.164172 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-log" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.164180 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-log" Jan 28 07:10:14 crc kubenswrapper[4776]: E0128 07:10:14.164217 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-api" Jan 28 07:10:14 crc 
kubenswrapper[4776]: I0128 07:10:14.164225 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-api" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.164474 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-log" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.164500 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" containerName="nova-api-api" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.164517 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" containerName="nova-scheduler-scheduler" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.165756 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.169491 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.175934 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.189798 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.190837 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.190949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rgq5\" (UniqueName: 
\"kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.191063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.191108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.201390 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.221675 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.224391 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.229634 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.259781 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.292892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.292975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.293185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.293257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.293383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.293469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rgq5\" (UniqueName: \"kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.293597 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgvmn\" (UniqueName: \"kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.294098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.298371 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.306747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: 
I0128 07:10:14.311932 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rgq5\" (UniqueName: \"kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5\") pod \"nova-api-0\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.395061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.395197 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.395276 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgvmn\" (UniqueName: \"kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.399507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.399985 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.411263 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgvmn\" (UniqueName: \"kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn\") pod \"nova-scheduler-0\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.506126 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.544075 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:14 crc kubenswrapper[4776]: I0128 07:10:14.979396 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:15 crc kubenswrapper[4776]: I0128 07:10:15.040756 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:15 crc kubenswrapper[4776]: W0128 07:10:15.046478 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode28c37f2_b73d_4b7e_b8c8_82baffe4d9ce.slice/crio-8cadda9a7b4d4009d05e5d9ab512af2c413d26f330dde59f2ddc0274c2a2b2f1 WatchSource:0}: Error finding container 8cadda9a7b4d4009d05e5d9ab512af2c413d26f330dde59f2ddc0274c2a2b2f1: Status 404 returned error can't find the container with id 8cadda9a7b4d4009d05e5d9ab512af2c413d26f330dde59f2ddc0274c2a2b2f1 Jan 28 07:10:15 crc kubenswrapper[4776]: I0128 07:10:15.123183 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce","Type":"ContainerStarted","Data":"8cadda9a7b4d4009d05e5d9ab512af2c413d26f330dde59f2ddc0274c2a2b2f1"} Jan 28 07:10:15 crc kubenswrapper[4776]: I0128 07:10:15.129216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerStarted","Data":"4093f3114711ff28330bfa58a4f9776ca3e3725be93f2c7afca80fe2a5457ff9"} Jan 28 07:10:15 crc kubenswrapper[4776]: I0128 07:10:15.325042 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e5558c-b4d0-4688-bbbe-89da202f6c56" path="/var/lib/kubelet/pods/92e5558c-b4d0-4688-bbbe-89da202f6c56/volumes" Jan 28 07:10:15 crc kubenswrapper[4776]: I0128 07:10:15.326255 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3095c4-70f9-4dd6-8797-76f60a334942" path="/var/lib/kubelet/pods/ad3095c4-70f9-4dd6-8797-76f60a334942/volumes" Jan 28 07:10:16 crc kubenswrapper[4776]: I0128 07:10:16.139618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerStarted","Data":"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1"} Jan 28 07:10:16 crc kubenswrapper[4776]: I0128 07:10:16.139665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerStarted","Data":"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f"} Jan 28 07:10:16 crc kubenswrapper[4776]: I0128 07:10:16.141649 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce","Type":"ContainerStarted","Data":"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806"} Jan 28 07:10:16 crc kubenswrapper[4776]: I0128 07:10:16.175905 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=2.175883678 podStartE2EDuration="2.175883678s" podCreationTimestamp="2026-01-28 07:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:16.162039384 +0000 UTC m=+1187.577699554" watchObservedRunningTime="2026-01-28 07:10:16.175883678 +0000 UTC m=+1187.591543838" Jan 28 07:10:16 crc kubenswrapper[4776]: I0128 07:10:16.179390 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.179365102 podStartE2EDuration="2.179365102s" podCreationTimestamp="2026-01-28 07:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:16.178028895 +0000 UTC m=+1187.593689075" watchObservedRunningTime="2026-01-28 07:10:16.179365102 +0000 UTC m=+1187.595025262" Jan 28 07:10:18 crc kubenswrapper[4776]: I0128 07:10:18.508529 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 28 07:10:19 crc kubenswrapper[4776]: I0128 07:10:19.545176 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 07:10:23 crc kubenswrapper[4776]: I0128 07:10:23.392157 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 07:10:24 crc kubenswrapper[4776]: I0128 07:10:24.507242 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:24 crc kubenswrapper[4776]: I0128 07:10:24.507697 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:24 crc kubenswrapper[4776]: I0128 07:10:24.545236 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 
07:10:24 crc kubenswrapper[4776]: I0128 07:10:24.575477 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 07:10:25 crc kubenswrapper[4776]: I0128 07:10:25.317857 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 07:10:25 crc kubenswrapper[4776]: I0128 07:10:25.590748 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:25 crc kubenswrapper[4776]: I0128 07:10:25.590801 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:27 crc kubenswrapper[4776]: I0128 07:10:27.403891 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:27 crc kubenswrapper[4776]: I0128 07:10:27.405585 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9c3e7326-8de9-4923-baae-72484416a58e" containerName="kube-state-metrics" containerID="cri-o://ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5" gracePeriod=30 Jan 28 07:10:27 crc kubenswrapper[4776]: I0128 07:10:27.426070 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="9c3e7326-8de9-4923-baae-72484416a58e" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": dial tcp 10.217.0.108:8081: connect: connection refused" Jan 28 07:10:27 crc kubenswrapper[4776]: I0128 07:10:27.915945 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.071039 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zk8s\" (UniqueName: \"kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s\") pod \"9c3e7326-8de9-4923-baae-72484416a58e\" (UID: \"9c3e7326-8de9-4923-baae-72484416a58e\") " Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.082756 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s" (OuterVolumeSpecName: "kube-api-access-2zk8s") pod "9c3e7326-8de9-4923-baae-72484416a58e" (UID: "9c3e7326-8de9-4923-baae-72484416a58e"). InnerVolumeSpecName "kube-api-access-2zk8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.173997 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zk8s\" (UniqueName: \"kubernetes.io/projected/9c3e7326-8de9-4923-baae-72484416a58e-kube-api-access-2zk8s\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.305355 4776 generic.go:334] "Generic (PLEG): container finished" podID="9c3e7326-8de9-4923-baae-72484416a58e" containerID="ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5" exitCode=2 Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.305398 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c3e7326-8de9-4923-baae-72484416a58e","Type":"ContainerDied","Data":"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5"} Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.305425 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"9c3e7326-8de9-4923-baae-72484416a58e","Type":"ContainerDied","Data":"6b541f073377216d9d36f31f2ab3a9284023f3f535a47398a1b1229d83653a7c"} Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.305444 4776 scope.go:117] "RemoveContainer" containerID="ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.305836 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.330752 4776 scope.go:117] "RemoveContainer" containerID="ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5" Jan 28 07:10:28 crc kubenswrapper[4776]: E0128 07:10:28.331295 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5\": container with ID starting with ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5 not found: ID does not exist" containerID="ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.331336 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5"} err="failed to get container status \"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5\": rpc error: code = NotFound desc = could not find container \"ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5\": container with ID starting with ae2ae2e12fce3bbdcda2070e7d8681b9ef2a01e8c68f05397218a734a5b1c4e5 not found: ID does not exist" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.348627 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.359981 4776 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.389650 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:28 crc kubenswrapper[4776]: E0128 07:10:28.390527 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3e7326-8de9-4923-baae-72484416a58e" containerName="kube-state-metrics" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.390568 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3e7326-8de9-4923-baae-72484416a58e" containerName="kube-state-metrics" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.391048 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3e7326-8de9-4923-baae-72484416a58e" containerName="kube-state-metrics" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.392272 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.395669 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.415837 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.438675 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.497649 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.497701 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.497787 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.497811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hl6\" (UniqueName: \"kubernetes.io/projected/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-api-access-m2hl6\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.599329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.599379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hl6\" (UniqueName: \"kubernetes.io/projected/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-api-access-m2hl6\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.599481 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.599508 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.603649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.604669 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.617952 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.626747 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m2hl6\" (UniqueName: \"kubernetes.io/projected/c15b0ff9-2ff0-4eed-821d-ba0da8122d6d-kube-api-access-m2hl6\") pod \"kube-state-metrics-0\" (UID: \"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d\") " pod="openstack/kube-state-metrics-0" Jan 28 07:10:28 crc kubenswrapper[4776]: I0128 07:10:28.814113 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 07:10:29 crc kubenswrapper[4776]: W0128 07:10:29.329444 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc15b0ff9_2ff0_4eed_821d_ba0da8122d6d.slice/crio-bceb806264d88cb21c245251e2a139c9f541b32ebc5b821ea4c09a008a211949 WatchSource:0}: Error finding container bceb806264d88cb21c245251e2a139c9f541b32ebc5b821ea4c09a008a211949: Status 404 returned error can't find the container with id bceb806264d88cb21c245251e2a139c9f541b32ebc5b821ea4c09a008a211949 Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.332888 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.333727 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3e7326-8de9-4923-baae-72484416a58e" path="/var/lib/kubelet/pods/9c3e7326-8de9-4923-baae-72484416a58e/volumes" Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.334944 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.420683 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.420955 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-central-agent" 
containerID="cri-o://ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf" gracePeriod=30 Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.421340 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-notification-agent" containerID="cri-o://cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f" gracePeriod=30 Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.421382 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="sg-core" containerID="cri-o://acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c" gracePeriod=30 Jan 28 07:10:29 crc kubenswrapper[4776]: I0128 07:10:29.421438 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="proxy-httpd" containerID="cri-o://80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d" gracePeriod=30 Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.341348 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d","Type":"ContainerStarted","Data":"2d60ed1923567f6746d1900e9fad827a39797a26e3aa58ce058a6d690b45b548"} Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.341772 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.341788 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c15b0ff9-2ff0-4eed-821d-ba0da8122d6d","Type":"ContainerStarted","Data":"bceb806264d88cb21c245251e2a139c9f541b32ebc5b821ea4c09a008a211949"} Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345579 4776 
generic.go:334] "Generic (PLEG): container finished" podID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerID="80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d" exitCode=0 Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345610 4776 generic.go:334] "Generic (PLEG): container finished" podID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerID="acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c" exitCode=2 Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345617 4776 generic.go:334] "Generic (PLEG): container finished" podID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerID="ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf" exitCode=0 Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345633 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerDied","Data":"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d"} Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerDied","Data":"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c"} Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.345670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerDied","Data":"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf"} Jan 28 07:10:30 crc kubenswrapper[4776]: I0128 07:10:30.361823 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.001483589 podStartE2EDuration="2.361803034s" podCreationTimestamp="2026-01-28 07:10:28 +0000 UTC" firstStartedPulling="2026-01-28 07:10:29.332509955 +0000 UTC m=+1200.748170135" lastFinishedPulling="2026-01-28 
07:10:29.69282938 +0000 UTC m=+1201.108489580" observedRunningTime="2026-01-28 07:10:30.356938823 +0000 UTC m=+1201.772598983" watchObservedRunningTime="2026-01-28 07:10:30.361803034 +0000 UTC m=+1201.777463204" Jan 28 07:10:32 crc kubenswrapper[4776]: E0128 07:10:32.233445 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb496787c_32af_4ac1_b15d_814ee1abc744.slice/crio-9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb496787c_32af_4ac1_b15d_814ee1abc744.slice/crio-conmon-9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871.scope\": RecentStats: unable to find data in memory cache]" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.370671 4776 generic.go:334] "Generic (PLEG): container finished" podID="b496787c-32af-4ac1-b15d-814ee1abc744" containerID="9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871" exitCode=137 Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.370783 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b496787c-32af-4ac1-b15d-814ee1abc744","Type":"ContainerDied","Data":"9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871"} Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.371126 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b496787c-32af-4ac1-b15d-814ee1abc744","Type":"ContainerDied","Data":"bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1"} Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.371142 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf8ff259d4c2f217361c5c94f2736c5d8580d3ef93d92f3d765e39ee311768a1" Jan 28 07:10:32 crc 
kubenswrapper[4776]: I0128 07:10:32.375730 4776 generic.go:334] "Generic (PLEG): container finished" podID="13487701-e252-4b0c-8991-df2864bcbde5" containerID="ae29791033ac7b5266aac2df203bfbb21af45f246276c8ea06f0b5ccdb3a99a4" exitCode=137 Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.375768 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerDied","Data":"ae29791033ac7b5266aac2df203bfbb21af45f246276c8ea06f0b5ccdb3a99a4"} Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.448021 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.453083 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.489811 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle\") pod \"b496787c-32af-4ac1-b15d-814ee1abc744\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.534368 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b496787c-32af-4ac1-b15d-814ee1abc744" (UID: "b496787c-32af-4ac1-b15d-814ee1abc744"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591265 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l42r\" (UniqueName: \"kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r\") pod \"13487701-e252-4b0c-8991-df2864bcbde5\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle\") pod \"13487701-e252-4b0c-8991-df2864bcbde5\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591393 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data\") pod \"13487701-e252-4b0c-8991-df2864bcbde5\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591581 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs\") pod \"13487701-e252-4b0c-8991-df2864bcbde5\" (UID: \"13487701-e252-4b0c-8991-df2864bcbde5\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591641 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data\") pod \"b496787c-32af-4ac1-b15d-814ee1abc744\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.591667 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhmcf\" (UniqueName: 
\"kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf\") pod \"b496787c-32af-4ac1-b15d-814ee1abc744\" (UID: \"b496787c-32af-4ac1-b15d-814ee1abc744\") " Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.592088 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.592768 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs" (OuterVolumeSpecName: "logs") pod "13487701-e252-4b0c-8991-df2864bcbde5" (UID: "13487701-e252-4b0c-8991-df2864bcbde5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.594949 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf" (OuterVolumeSpecName: "kube-api-access-lhmcf") pod "b496787c-32af-4ac1-b15d-814ee1abc744" (UID: "b496787c-32af-4ac1-b15d-814ee1abc744"). InnerVolumeSpecName "kube-api-access-lhmcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.597305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r" (OuterVolumeSpecName: "kube-api-access-2l42r") pod "13487701-e252-4b0c-8991-df2864bcbde5" (UID: "13487701-e252-4b0c-8991-df2864bcbde5"). InnerVolumeSpecName "kube-api-access-2l42r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.623490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13487701-e252-4b0c-8991-df2864bcbde5" (UID: "13487701-e252-4b0c-8991-df2864bcbde5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.624133 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data" (OuterVolumeSpecName: "config-data") pod "13487701-e252-4b0c-8991-df2864bcbde5" (UID: "13487701-e252-4b0c-8991-df2864bcbde5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.628794 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data" (OuterVolumeSpecName: "config-data") pod "b496787c-32af-4ac1-b15d-814ee1abc744" (UID: "b496787c-32af-4ac1-b15d-814ee1abc744"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695359 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695398 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13487701-e252-4b0c-8991-df2864bcbde5-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695409 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13487701-e252-4b0c-8991-df2864bcbde5-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695421 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b496787c-32af-4ac1-b15d-814ee1abc744-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695432 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhmcf\" (UniqueName: \"kubernetes.io/projected/b496787c-32af-4ac1-b15d-814ee1abc744-kube-api-access-lhmcf\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:32 crc kubenswrapper[4776]: I0128 07:10:32.695443 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l42r\" (UniqueName: \"kubernetes.io/projected/13487701-e252-4b0c-8991-df2864bcbde5-kube-api-access-2l42r\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.388200 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.388448 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.388470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13487701-e252-4b0c-8991-df2864bcbde5","Type":"ContainerDied","Data":"b2db7fe78a4eb7f461a68c4b9eafc26c5279295976b99f0539596bf4c81c8d48"} Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.389536 4776 scope.go:117] "RemoveContainer" containerID="ae29791033ac7b5266aac2df203bfbb21af45f246276c8ea06f0b5ccdb3a99a4" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.431967 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.436197 4776 scope.go:117] "RemoveContainer" containerID="107a91f95a36a34c3052a52e3a3d60ca9bddbd6b024399d92e9fe1ea49578c95" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.463709 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.505608 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.533683 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.549902 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: E0128 07:10:33.550439 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b496787c-32af-4ac1-b15d-814ee1abc744" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550464 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b496787c-32af-4ac1-b15d-814ee1abc744" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 07:10:33 crc kubenswrapper[4776]: E0128 07:10:33.550480 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-metadata" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550486 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-metadata" Jan 28 07:10:33 crc kubenswrapper[4776]: E0128 07:10:33.550526 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-log" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550532 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-log" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550809 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-log" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550832 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b496787c-32af-4ac1-b15d-814ee1abc744" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.550845 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13487701-e252-4b0c-8991-df2864bcbde5" containerName="nova-metadata-metadata" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.551588 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.553339 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.553374 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.553346 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.562920 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.565779 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.568612 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.568769 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.574802 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.587602 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 
07:10:33.615746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c74n\" (UniqueName: \"kubernetes.io/projected/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-kube-api-access-9c74n\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615774 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615802 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615871 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: 
I0128 07:10:33.615899 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b58w\" (UniqueName: \"kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.615993 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717403 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9c74n\" (UniqueName: \"kubernetes.io/projected/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-kube-api-access-9c74n\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717455 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717517 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717680 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717736 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717787 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717834 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b58w\" (UniqueName: \"kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.717890 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.718098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.727571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " 
pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.728090 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.728895 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.728949 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.730096 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.738214 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.750739 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9b58w\" (UniqueName: \"kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w\") pod \"nova-metadata-0\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " pod="openstack/nova-metadata-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.753413 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c74n\" (UniqueName: \"kubernetes.io/projected/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-kube-api-access-9c74n\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.755702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a1237de-1bcc-4b1b-bff5-a775162f3ed9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5a1237de-1bcc-4b1b-bff5-a775162f3ed9\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.852042 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.852119 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.870406 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:33 crc kubenswrapper[4776]: I0128 07:10:33.887353 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.375497 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.401395 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerStarted","Data":"6e88819acfa0f786a387cf4fc95904812512579bba68a020f7142a7e187350e4"} Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.436971 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 07:10:34 crc kubenswrapper[4776]: W0128 07:10:34.442768 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a1237de_1bcc_4b1b_bff5_a775162f3ed9.slice/crio-7a9811feba5689e84274cac5829390a9491a4b349e17b7ef4d99844760334921 WatchSource:0}: Error finding container 7a9811feba5689e84274cac5829390a9491a4b349e17b7ef4d99844760334921: Status 404 returned error can't find the container with id 7a9811feba5689e84274cac5829390a9491a4b349e17b7ef4d99844760334921 Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.514571 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.515982 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.520973 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 07:10:34 crc kubenswrapper[4776]: I0128 07:10:34.524489 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-api-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.000849 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048209 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048279 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048321 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048411 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prgn4\" (UniqueName: \"kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048454 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc 
kubenswrapper[4776]: I0128 07:10:35.048510 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.048675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd\") pod \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\" (UID: \"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e\") " Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.049136 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.049536 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.058097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts" (OuterVolumeSpecName: "scripts") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.065127 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4" (OuterVolumeSpecName: "kube-api-access-prgn4") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "kube-api-access-prgn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.114670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.149146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150454 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150475 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150484 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150495 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150503 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prgn4\" (UniqueName: \"kubernetes.io/projected/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-kube-api-access-prgn4\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.150514 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.177350 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data" (OuterVolumeSpecName: "config-data") pod "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" (UID: "c0e63f1d-9670-4be6-ae47-f59ef27fdc2e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.251915 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.321636 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13487701-e252-4b0c-8991-df2864bcbde5" path="/var/lib/kubelet/pods/13487701-e252-4b0c-8991-df2864bcbde5/volumes" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.322390 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b496787c-32af-4ac1-b15d-814ee1abc744" path="/var/lib/kubelet/pods/b496787c-32af-4ac1-b15d-814ee1abc744/volumes" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.414152 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5a1237de-1bcc-4b1b-bff5-a775162f3ed9","Type":"ContainerStarted","Data":"525956070a10bbc18188778f660a775da051fcac9fade6136f5a4565a0060674"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.414636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5a1237de-1bcc-4b1b-bff5-a775162f3ed9","Type":"ContainerStarted","Data":"7a9811feba5689e84274cac5829390a9491a4b349e17b7ef4d99844760334921"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.416371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerStarted","Data":"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.416491 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerStarted","Data":"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.422876 4776 generic.go:334] "Generic (PLEG): container finished" podID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerID="cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f" exitCode=0 Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.424315 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.424906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerDied","Data":"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.424945 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.424959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0e63f1d-9670-4be6-ae47-f59ef27fdc2e","Type":"ContainerDied","Data":"5dee53da67b3f636aada3cf42b77077c5c60edaa010b5e9d9b5ae7ac99403f77"} Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.424978 4776 scope.go:117] "RemoveContainer" containerID="80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.430667 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.445896 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.445877665 podStartE2EDuration="2.445877665s" podCreationTimestamp="2026-01-28 07:10:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:35.430661144 +0000 UTC m=+1206.846321314" watchObservedRunningTime="2026-01-28 07:10:35.445877665 +0000 UTC m=+1206.861537835" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.478303 4776 scope.go:117] "RemoveContainer" containerID="acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.488915 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.488661661 podStartE2EDuration="2.488661661s" podCreationTimestamp="2026-01-28 07:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:35.484265252 +0000 UTC m=+1206.899925412" watchObservedRunningTime="2026-01-28 07:10:35.488661661 +0000 UTC m=+1206.904321821" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.546175 4776 scope.go:117] "RemoveContainer" containerID="cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.567611 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.615410 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.616716 4776 scope.go:117] "RemoveContainer" containerID="ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.629844 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.630245 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" 
containerName="ceilometer-central-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630262 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-central-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.630271 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="proxy-httpd" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630278 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="proxy-httpd" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.630289 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-notification-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630296 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-notification-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.630332 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="sg-core" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630339 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="sg-core" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630556 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="proxy-httpd" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630573 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-notification-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630588 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="ceilometer-central-agent" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.630605 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" containerName="sg-core" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.632271 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.637415 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.637613 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.637791 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.645615 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.659353 4776 scope.go:117] "RemoveContainer" containerID="80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.662469 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.663282 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d\": container with ID starting with 80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d not found: ID does not exist" containerID="80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.663335 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d"} err="failed to get container status \"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d\": rpc error: code = NotFound desc = could not find container \"80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d\": container with ID starting with 80a0e5d91eef9b50c58a931fe2e185b7dcfa3cf78580c301a919abbae7b3ec8d not found: ID does not exist" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.663359 4776 scope.go:117] "RemoveContainer" containerID="acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.665030 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.665250 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c\": container with ID starting with acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c not found: ID does not exist" containerID="acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.665289 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c"} err="failed to get container status \"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c\": rpc error: code = NotFound desc = could not find container \"acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c\": container with ID starting with acdfad40b8c2e5c3e509613efb471c522c15ea63768d75ba8b38aceb4087f71c not found: ID does not exist" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 
07:10:35.665306 4776 scope.go:117] "RemoveContainer" containerID="cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.665526 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f\": container with ID starting with cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f not found: ID does not exist" containerID="cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.665565 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f"} err="failed to get container status \"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f\": rpc error: code = NotFound desc = could not find container \"cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f\": container with ID starting with cfeeabf3a3202d8f5d081cb49d65dc7fb5054ef80e9717964236e58462c7c86f not found: ID does not exist" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.665578 4776 scope.go:117] "RemoveContainer" containerID="ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf" Jan 28 07:10:35 crc kubenswrapper[4776]: E0128 07:10:35.665822 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf\": container with ID starting with ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf not found: ID does not exist" containerID="ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.665839 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf"} err="failed to get container status \"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf\": rpc error: code = NotFound desc = could not find container \"ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf\": container with ID starting with ccbac3e77006da25eafd54b1a36432127861753cd17b0a5e7c18593559bc9bcf not found: ID does not exist" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.676907 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.685930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686028 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl95h\" (UniqueName: \"kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686081 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686147 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686170 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zp6\" (UniqueName: \"kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686307 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686324 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686399 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686433 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.686558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788733 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788788 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl95h\" (UniqueName: \"kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788816 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788850 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 
28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788889 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788913 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zp6\" (UniqueName: \"kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788969 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.788998 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.789012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.789032 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.789073 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.789925 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.790018 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " 
pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.790778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.791837 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.791884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.792254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.792260 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.793274 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.795493 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.796303 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.799141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.799986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.815138 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zp6\" (UniqueName: \"kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6\") pod \"dnsmasq-dns-89c5cd4d5-vcgfw\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.818668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl95h\" (UniqueName: \"kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h\") pod \"ceilometer-0\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.959102 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:35 crc kubenswrapper[4776]: I0128 07:10:35.988519 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:36 crc kubenswrapper[4776]: I0128 07:10:36.526095 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:10:36 crc kubenswrapper[4776]: I0128 07:10:36.584952 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:36 crc kubenswrapper[4776]: W0128 07:10:36.593966 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95856e13_d88c_4544_959f_a3d28346b0ef.slice/crio-cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2 WatchSource:0}: Error finding container cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2: Status 404 returned error can't find the container with id cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2 Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.318256 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e63f1d-9670-4be6-ae47-f59ef27fdc2e" path="/var/lib/kubelet/pods/c0e63f1d-9670-4be6-ae47-f59ef27fdc2e/volumes" Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.487627 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerID="81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9" exitCode=0 Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.487788 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" event={"ID":"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd","Type":"ContainerDied","Data":"81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9"} Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.487987 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" event={"ID":"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd","Type":"ContainerStarted","Data":"0d8e615026bf861c6241b9fbd58646a8c68d6050d190b4b22ad2ab1bd027f991"} Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.491773 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerStarted","Data":"348a6e05a4815ebd00a8a79860c0672a83948db9951a31c1eb3b590317f19f7b"} Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.491835 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerStarted","Data":"cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2"} Jan 28 07:10:37 crc kubenswrapper[4776]: I0128 07:10:37.689723 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.092421 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.512367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" event={"ID":"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd","Type":"ContainerStarted","Data":"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e"} Jan 28 07:10:38 crc 
kubenswrapper[4776]: I0128 07:10:38.512966 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.515016 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-log" containerID="cri-o://717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f" gracePeriod=30 Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.515095 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerStarted","Data":"9c52a0d602820b67b5a7ee93eb456be265980e38ba4b5447d30e9560beaa0ca1"} Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.515148 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-api" containerID="cri-o://1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1" gracePeriod=30 Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.539948 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" podStartSLOduration=3.5399287299999997 podStartE2EDuration="3.53992873s" podCreationTimestamp="2026-01-28 07:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:38.531608486 +0000 UTC m=+1209.947268646" watchObservedRunningTime="2026-01-28 07:10:38.53992873 +0000 UTC m=+1209.955588890" Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.870522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.887528 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:10:38 crc kubenswrapper[4776]: I0128 07:10:38.887603 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:10:39 crc kubenswrapper[4776]: I0128 07:10:39.031498 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 28 07:10:39 crc kubenswrapper[4776]: I0128 07:10:39.539504 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerStarted","Data":"c720c04389fda0e82cedc90e2414fe31334a4c299f39a9a801fdeacad900f1d9"} Jan 28 07:10:39 crc kubenswrapper[4776]: I0128 07:10:39.548105 4776 generic.go:334] "Generic (PLEG): container finished" podID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerID="717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f" exitCode=143 Jan 28 07:10:39 crc kubenswrapper[4776]: I0128 07:10:39.548703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerDied","Data":"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f"} Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.561333 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerStarted","Data":"befe3a9224afd3d6a5d7170057eca386d4d9340503320d60f75d312188731c63"} Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.561818 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-central-agent" containerID="cri-o://348a6e05a4815ebd00a8a79860c0672a83948db9951a31c1eb3b590317f19f7b" gracePeriod=30 Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.562090 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="proxy-httpd" containerID="cri-o://befe3a9224afd3d6a5d7170057eca386d4d9340503320d60f75d312188731c63" gracePeriod=30 Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.562192 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-notification-agent" containerID="cri-o://9c52a0d602820b67b5a7ee93eb456be265980e38ba4b5447d30e9560beaa0ca1" gracePeriod=30 Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.562240 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="sg-core" containerID="cri-o://c720c04389fda0e82cedc90e2414fe31334a4c299f39a9a801fdeacad900f1d9" gracePeriod=30 Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.562391 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:10:40 crc kubenswrapper[4776]: I0128 07:10:40.609017 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.176789121 podStartE2EDuration="5.608994362s" podCreationTimestamp="2026-01-28 07:10:35 +0000 UTC" firstStartedPulling="2026-01-28 07:10:36.604978862 +0000 UTC m=+1208.020639022" lastFinishedPulling="2026-01-28 07:10:40.037184113 +0000 UTC m=+1211.452844263" observedRunningTime="2026-01-28 07:10:40.598033436 +0000 UTC m=+1212.013693606" watchObservedRunningTime="2026-01-28 07:10:40.608994362 +0000 UTC m=+1212.024654542" Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573350 4776 generic.go:334] "Generic (PLEG): container finished" podID="95856e13-d88c-4544-959f-a3d28346b0ef" containerID="befe3a9224afd3d6a5d7170057eca386d4d9340503320d60f75d312188731c63" 
exitCode=0 Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573390 4776 generic.go:334] "Generic (PLEG): container finished" podID="95856e13-d88c-4544-959f-a3d28346b0ef" containerID="c720c04389fda0e82cedc90e2414fe31334a4c299f39a9a801fdeacad900f1d9" exitCode=2 Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573401 4776 generic.go:334] "Generic (PLEG): container finished" podID="95856e13-d88c-4544-959f-a3d28346b0ef" containerID="9c52a0d602820b67b5a7ee93eb456be265980e38ba4b5447d30e9560beaa0ca1" exitCode=0 Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573424 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerDied","Data":"befe3a9224afd3d6a5d7170057eca386d4d9340503320d60f75d312188731c63"} Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573453 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerDied","Data":"c720c04389fda0e82cedc90e2414fe31334a4c299f39a9a801fdeacad900f1d9"} Jan 28 07:10:41 crc kubenswrapper[4776]: I0128 07:10:41.573467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerDied","Data":"9c52a0d602820b67b5a7ee93eb456be265980e38ba4b5447d30e9560beaa0ca1"} Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.169787 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.344930 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle\") pod \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.345024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs\") pod \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.345131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rgq5\" (UniqueName: \"kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5\") pod \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.345261 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data\") pod \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\" (UID: \"30578448-fa5a-4aca-9318-9eb78cc0a9ca\") " Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.345933 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs" (OuterVolumeSpecName: "logs") pod "30578448-fa5a-4aca-9318-9eb78cc0a9ca" (UID: "30578448-fa5a-4aca-9318-9eb78cc0a9ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.377205 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5" (OuterVolumeSpecName: "kube-api-access-6rgq5") pod "30578448-fa5a-4aca-9318-9eb78cc0a9ca" (UID: "30578448-fa5a-4aca-9318-9eb78cc0a9ca"). InnerVolumeSpecName "kube-api-access-6rgq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.453760 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30578448-fa5a-4aca-9318-9eb78cc0a9ca-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.453797 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rgq5\" (UniqueName: \"kubernetes.io/projected/30578448-fa5a-4aca-9318-9eb78cc0a9ca-kube-api-access-6rgq5\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.475712 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30578448-fa5a-4aca-9318-9eb78cc0a9ca" (UID: "30578448-fa5a-4aca-9318-9eb78cc0a9ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.526389 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data" (OuterVolumeSpecName: "config-data") pod "30578448-fa5a-4aca-9318-9eb78cc0a9ca" (UID: "30578448-fa5a-4aca-9318-9eb78cc0a9ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.556407 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.556442 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30578448-fa5a-4aca-9318-9eb78cc0a9ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.589832 4776 generic.go:334] "Generic (PLEG): container finished" podID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerID="1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1" exitCode=0 Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.589907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerDied","Data":"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1"} Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.589963 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30578448-fa5a-4aca-9318-9eb78cc0a9ca","Type":"ContainerDied","Data":"4093f3114711ff28330bfa58a4f9776ca3e3725be93f2c7afca80fe2a5457ff9"} Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.589988 4776 scope.go:117] "RemoveContainer" containerID="1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.590245 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.627111 4776 scope.go:117] "RemoveContainer" containerID="717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.657228 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.668669 4776 scope.go:117] "RemoveContainer" containerID="1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.668773 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:42 crc kubenswrapper[4776]: E0128 07:10:42.669132 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1\": container with ID starting with 1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1 not found: ID does not exist" containerID="1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.669359 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1"} err="failed to get container status \"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1\": rpc error: code = NotFound desc = could not find container \"1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1\": container with ID starting with 1735ec56e5ab92b5a3534447e80bebbbb91f1b575146a7e90a7d8948049553a1 not found: ID does not exist" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.690221 4776 scope.go:117] "RemoveContainer" containerID="717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f" Jan 28 07:10:42 crc kubenswrapper[4776]: E0128 
07:10:42.690716 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f\": container with ID starting with 717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f not found: ID does not exist" containerID="717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.690757 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f"} err="failed to get container status \"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f\": rpc error: code = NotFound desc = could not find container \"717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f\": container with ID starting with 717ee8b2af62b71284befbcaa5e4828d54c5621883195ceef360ca0b60809d2f not found: ID does not exist" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.700577 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:42 crc kubenswrapper[4776]: E0128 07:10:42.701077 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-api" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.701097 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-api" Jan 28 07:10:42 crc kubenswrapper[4776]: E0128 07:10:42.701109 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-log" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.701118 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-log" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.701354 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-log" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.701381 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" containerName="nova-api-api" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.702578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.705948 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.706112 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.709755 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.711340 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.861811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.861915 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.861943 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.862137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzw6n\" (UniqueName: \"kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.862198 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.862230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963519 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data\") pod 
\"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963643 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzw6n\" (UniqueName: \"kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.963731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.964228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.968819 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.969021 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.969236 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.969434 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:42 crc kubenswrapper[4776]: I0128 07:10:42.982012 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzw6n\" (UniqueName: \"kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n\") pod \"nova-api-0\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " pod="openstack/nova-api-0" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.019212 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.318850 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30578448-fa5a-4aca-9318-9eb78cc0a9ca" path="/var/lib/kubelet/pods/30578448-fa5a-4aca-9318-9eb78cc0a9ca/volumes" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.506890 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.609381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerStarted","Data":"3e1b85fc3544288b817da43e71db291a83ed0a0643f48eeeb0a51bcffc8e5b86"} Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.871333 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.887847 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.887890 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 07:10:43 crc kubenswrapper[4776]: I0128 07:10:43.892844 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.622045 4776 generic.go:334] "Generic (PLEG): container finished" podID="95856e13-d88c-4544-959f-a3d28346b0ef" containerID="348a6e05a4815ebd00a8a79860c0672a83948db9951a31c1eb3b590317f19f7b" exitCode=0 Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.622168 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerDied","Data":"348a6e05a4815ebd00a8a79860c0672a83948db9951a31c1eb3b590317f19f7b"} Jan 
28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.622424 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95856e13-d88c-4544-959f-a3d28346b0ef","Type":"ContainerDied","Data":"cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2"} Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.622438 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab5fc92fc671b9fa8dd856d5656016462cad52872296d734ba430e33ff7d9e2" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.626000 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerStarted","Data":"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12"} Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.626035 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerStarted","Data":"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788"} Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.646957 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.650665 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6506398 podStartE2EDuration="2.6506398s" podCreationTimestamp="2026-01-28 07:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:44.640254569 +0000 UTC m=+1216.055914729" watchObservedRunningTime="2026-01-28 07:10:44.6506398 +0000 UTC m=+1216.066299970" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.660965 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.808478 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.808753 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.808870 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.809028 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.809137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.809240 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl95h\" 
(UniqueName: \"kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.809350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.809462 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd\") pod \"95856e13-d88c-4544-959f-a3d28346b0ef\" (UID: \"95856e13-d88c-4544-959f-a3d28346b0ef\") " Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.811235 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.812265 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.819880 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-t2477"] Jan 28 07:10:44 crc kubenswrapper[4776]: E0128 07:10:44.820463 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="sg-core" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.820532 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="sg-core" Jan 28 07:10:44 crc kubenswrapper[4776]: E0128 07:10:44.820630 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-notification-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.820687 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-notification-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: E0128 07:10:44.820762 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-central-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.820831 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-central-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: E0128 07:10:44.820911 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="proxy-httpd" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.820984 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="proxy-httpd" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.821219 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" 
containerName="proxy-httpd" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.821283 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="sg-core" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.821348 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-central-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.821415 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" containerName="ceilometer-notification-agent" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.822188 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.826141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.826326 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.830511 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts" (OuterVolumeSpecName: "scripts") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.839388 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h" (OuterVolumeSpecName: "kube-api-access-cl95h") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). 
InnerVolumeSpecName "kube-api-access-cl95h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.844388 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2477"] Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.874652 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.878009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.892835 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.897863 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911339 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911368 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911377 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911387 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911395 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/95856e13-d88c-4544-959f-a3d28346b0ef-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.911404 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl95h\" (UniqueName: \"kubernetes.io/projected/95856e13-d88c-4544-959f-a3d28346b0ef-kube-api-access-cl95h\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.938706 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:44 crc kubenswrapper[4776]: I0128 07:10:44.978235 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data" (OuterVolumeSpecName: "config-data") pod "95856e13-d88c-4544-959f-a3d28346b0ef" (UID: "95856e13-d88c-4544-959f-a3d28346b0ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.003824 4776 scope.go:117] "RemoveContainer" containerID="d5f02d16d9d027560160966c6bf65cd53e40009aab9b6b04f1102162b4ff7302" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.015936 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.016001 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.016044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jz2\" (UniqueName: \"kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.016143 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.016227 4776 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.016238 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95856e13-d88c-4544-959f-a3d28346b0ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.039793 4776 scope.go:117] "RemoveContainer" containerID="26c1327c61fa8f8e69913397bbcb81df98863c71b4e09664fca2eceb40888e59" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.071835 4776 scope.go:117] "RemoveContainer" containerID="1bf2c25d96bfb443c0d64a331ff147fd56e0a1e88b358ecb63ce83150c4fa11d" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.117911 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.117997 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.118028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.118063 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jz2\" (UniqueName: \"kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.122649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.123055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.125098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.134834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jz2\" (UniqueName: \"kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2\") pod \"nova-cell1-cell-mapping-t2477\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.193451 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.646522 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.695178 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.707218 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.716147 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.718870 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.727252 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.727486 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.727768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.734269 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.843177 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2477"] Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.862829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.862923 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bd6t\" (UniqueName: \"kubernetes.io/projected/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-kube-api-access-7bd6t\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.862987 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-config-data\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.863126 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-log-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.863199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-run-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.863396 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 
07:10:45.863453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.863534 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-scripts\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.966760 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-log-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.966897 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-run-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-scripts\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967189 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967256 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bd6t\" (UniqueName: \"kubernetes.io/projected/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-kube-api-access-7bd6t\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967328 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-config-data\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.967843 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-log-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 
07:10:45.968875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-run-httpd\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.973293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.973976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.975043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-config-data\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.976906 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-scripts\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.977264 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " 
pod="openstack/ceilometer-0" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.990775 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:10:45 crc kubenswrapper[4776]: I0128 07:10:45.996186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bd6t\" (UniqueName: \"kubernetes.io/projected/259f2bd2-3855-4ebb-8eeb-1457a26c74ae-kube-api-access-7bd6t\") pod \"ceilometer-0\" (UID: \"259f2bd2-3855-4ebb-8eeb-1457a26c74ae\") " pod="openstack/ceilometer-0" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.039792 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.143109 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.143467 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerName="dnsmasq-dns" containerID="cri-o://049705c2e9b72b399b18a24b9a710269ecb8af45f04cd1b326c3db1711c29dde" gracePeriod=10 Jan 28 07:10:46 crc kubenswrapper[4776]: W0128 07:10:46.593263 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259f2bd2_3855_4ebb_8eeb_1457a26c74ae.slice/crio-10795673fd8898507a6e745087b941d602634269e0736f57d10aa59c7780d995 WatchSource:0}: Error finding container 10795673fd8898507a6e745087b941d602634269e0736f57d10aa59c7780d995: Status 404 returned error can't find the container with id 10795673fd8898507a6e745087b941d602634269e0736f57d10aa59c7780d995 Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.593886 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 
07:10:46.662382 4776 generic.go:334] "Generic (PLEG): container finished" podID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerID="049705c2e9b72b399b18a24b9a710269ecb8af45f04cd1b326c3db1711c29dde" exitCode=0 Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.662437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" event={"ID":"af39015e-a0f9-4ebd-b6c5-865f3081b2da","Type":"ContainerDied","Data":"049705c2e9b72b399b18a24b9a710269ecb8af45f04cd1b326c3db1711c29dde"} Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.662461 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" event={"ID":"af39015e-a0f9-4ebd-b6c5-865f3081b2da","Type":"ContainerDied","Data":"5e634e178bf659875670ce7ea5a32b637a685031c89f9f55f9c28b98a803562c"} Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.662471 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e634e178bf659875670ce7ea5a32b637a685031c89f9f55f9c28b98a803562c" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.663910 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2477" event={"ID":"2c9c2ccb-51d7-4307-8536-11657989c02d","Type":"ContainerStarted","Data":"2a2fcc8f96b985d910f5b4e5f353da1467c1090ce018f09c73fe3b6641d262f4"} Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.663934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2477" event={"ID":"2c9c2ccb-51d7-4307-8536-11657989c02d","Type":"ContainerStarted","Data":"5e65a6fa73dced4c02e1f49be0c6d1d0df23f9b4491c54b96749a1032c3a9bc1"} Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.667784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"259f2bd2-3855-4ebb-8eeb-1457a26c74ae","Type":"ContainerStarted","Data":"10795673fd8898507a6e745087b941d602634269e0736f57d10aa59c7780d995"} Jan 28 07:10:46 
crc kubenswrapper[4776]: I0128 07:10:46.684268 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.688597 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-t2477" podStartSLOduration=2.688576492 podStartE2EDuration="2.688576492s" podCreationTimestamp="2026-01-28 07:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:10:46.682099376 +0000 UTC m=+1218.097759536" watchObservedRunningTime="2026-01-28 07:10:46.688576492 +0000 UTC m=+1218.104236642" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804005 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804044 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804062 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc86m\" (UniqueName: 
\"kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804206 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.804316 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0\") pod \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\" (UID: \"af39015e-a0f9-4ebd-b6c5-865f3081b2da\") " Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.816180 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m" (OuterVolumeSpecName: "kube-api-access-tc86m") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "kube-api-access-tc86m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.866739 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.877400 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config" (OuterVolumeSpecName: "config") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.886478 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.899118 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.900494 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af39015e-a0f9-4ebd-b6c5-865f3081b2da" (UID: "af39015e-a0f9-4ebd-b6c5-865f3081b2da"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906054 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906087 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906098 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906107 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906116 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc86m\" (UniqueName: \"kubernetes.io/projected/af39015e-a0f9-4ebd-b6c5-865f3081b2da-kube-api-access-tc86m\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:46 crc kubenswrapper[4776]: I0128 07:10:46.906126 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af39015e-a0f9-4ebd-b6c5-865f3081b2da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:47 crc kubenswrapper[4776]: I0128 07:10:47.324687 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95856e13-d88c-4544-959f-a3d28346b0ef" path="/var/lib/kubelet/pods/95856e13-d88c-4544-959f-a3d28346b0ef/volumes" Jan 28 07:10:47 crc kubenswrapper[4776]: I0128 07:10:47.678129 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"259f2bd2-3855-4ebb-8eeb-1457a26c74ae","Type":"ContainerStarted","Data":"85bf1b031c20a10bc884e5ad972e76501375f2b9e7235cc05cc6a9d862ff55bd"} Jan 28 07:10:47 crc kubenswrapper[4776]: I0128 07:10:47.678214 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" Jan 28 07:10:47 crc kubenswrapper[4776]: I0128 07:10:47.708302 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:10:47 crc kubenswrapper[4776]: I0128 07:10:47.721194 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckcck"] Jan 28 07:10:48 crc kubenswrapper[4776]: I0128 07:10:48.693079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"259f2bd2-3855-4ebb-8eeb-1457a26c74ae","Type":"ContainerStarted","Data":"18b73b14c9ad6d0365435411c808c99ef4b8bff7a400145967ffa2dbb25fe2ca"} Jan 28 07:10:49 crc kubenswrapper[4776]: I0128 07:10:49.322404 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" path="/var/lib/kubelet/pods/af39015e-a0f9-4ebd-b6c5-865f3081b2da/volumes" Jan 28 07:10:49 crc kubenswrapper[4776]: I0128 07:10:49.707273 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"259f2bd2-3855-4ebb-8eeb-1457a26c74ae","Type":"ContainerStarted","Data":"75dce78febb86640f96400043b0dfa1fe31b6592de1c0a641c6cf09ffc1a9597"} Jan 28 07:10:50 crc kubenswrapper[4776]: I0128 07:10:50.721376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"259f2bd2-3855-4ebb-8eeb-1457a26c74ae","Type":"ContainerStarted","Data":"8f5a55bd65926555603128e0bd50215b96454478b2e7855cbcf4dad0d7b3fc52"} Jan 28 07:10:50 crc kubenswrapper[4776]: I0128 07:10:50.723178 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 07:10:50 crc kubenswrapper[4776]: I0128 07:10:50.757705 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.201595742 podStartE2EDuration="5.757691231s" podCreationTimestamp="2026-01-28 07:10:45 +0000 UTC" firstStartedPulling="2026-01-28 07:10:46.599782382 +0000 UTC m=+1218.015442542" lastFinishedPulling="2026-01-28 07:10:50.155877871 +0000 UTC m=+1221.571538031" observedRunningTime="2026-01-28 07:10:50.753829217 +0000 UTC m=+1222.169489377" watchObservedRunningTime="2026-01-28 07:10:50.757691231 +0000 UTC m=+1222.173351391" Jan 28 07:10:51 crc kubenswrapper[4776]: I0128 07:10:51.451436 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-ckcck" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.206:5353: i/o timeout" Jan 28 07:10:51 crc kubenswrapper[4776]: I0128 07:10:51.748577 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2477" event={"ID":"2c9c2ccb-51d7-4307-8536-11657989c02d","Type":"ContainerDied","Data":"2a2fcc8f96b985d910f5b4e5f353da1467c1090ce018f09c73fe3b6641d262f4"} Jan 28 07:10:51 crc kubenswrapper[4776]: I0128 07:10:51.748578 4776 generic.go:334] "Generic (PLEG): container finished" podID="2c9c2ccb-51d7-4307-8536-11657989c02d" containerID="2a2fcc8f96b985d910f5b4e5f353da1467c1090ce018f09c73fe3b6641d262f4" exitCode=0 Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.019465 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.019813 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.148228 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.240094 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle\") pod \"2c9c2ccb-51d7-4307-8536-11657989c02d\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.240147 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts\") pod \"2c9c2ccb-51d7-4307-8536-11657989c02d\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.240312 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data\") pod \"2c9c2ccb-51d7-4307-8536-11657989c02d\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.240402 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5jz2\" (UniqueName: \"kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2\") pod \"2c9c2ccb-51d7-4307-8536-11657989c02d\" (UID: \"2c9c2ccb-51d7-4307-8536-11657989c02d\") " Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.247396 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2" (OuterVolumeSpecName: "kube-api-access-p5jz2") pod "2c9c2ccb-51d7-4307-8536-11657989c02d" (UID: "2c9c2ccb-51d7-4307-8536-11657989c02d"). InnerVolumeSpecName "kube-api-access-p5jz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.247415 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts" (OuterVolumeSpecName: "scripts") pod "2c9c2ccb-51d7-4307-8536-11657989c02d" (UID: "2c9c2ccb-51d7-4307-8536-11657989c02d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.280745 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c9c2ccb-51d7-4307-8536-11657989c02d" (UID: "2c9c2ccb-51d7-4307-8536-11657989c02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.311701 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data" (OuterVolumeSpecName: "config-data") pod "2c9c2ccb-51d7-4307-8536-11657989c02d" (UID: "2c9c2ccb-51d7-4307-8536-11657989c02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.343302 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.343340 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.343352 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c9c2ccb-51d7-4307-8536-11657989c02d-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.343363 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5jz2\" (UniqueName: \"kubernetes.io/projected/2c9c2ccb-51d7-4307-8536-11657989c02d-kube-api-access-p5jz2\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.768303 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2477" event={"ID":"2c9c2ccb-51d7-4307-8536-11657989c02d","Type":"ContainerDied","Data":"5e65a6fa73dced4c02e1f49be0c6d1d0df23f9b4491c54b96749a1032c3a9bc1"} Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.768364 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e65a6fa73dced4c02e1f49be0c6d1d0df23f9b4491c54b96749a1032c3a9bc1" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.768389 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2477" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.897003 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.897926 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.902154 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.979849 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.980398 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-log" containerID="cri-o://bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788" gracePeriod=30 Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.980486 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-api" containerID="cri-o://006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12" gracePeriod=30 Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.994488 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF" Jan 28 07:10:53 crc kubenswrapper[4776]: I0128 07:10:53.994526 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF" Jan 28 
07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.002564 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.003155 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerName="nova-scheduler-scheduler" containerID="cri-o://86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" gracePeriod=30 Jan 28 07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.035262 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:54 crc kubenswrapper[4776]: E0128 07:10:54.550971 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:54 crc kubenswrapper[4776]: E0128 07:10:54.552690 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:54 crc kubenswrapper[4776]: E0128 07:10:54.555995 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 28 07:10:54 crc kubenswrapper[4776]: E0128 07:10:54.556033 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerName="nova-scheduler-scheduler" Jan 28 07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.783987 4776 generic.go:334] "Generic (PLEG): container finished" podID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerID="bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788" exitCode=143 Jan 28 07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.785634 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerDied","Data":"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788"} Jan 28 07:10:54 crc kubenswrapper[4776]: I0128 07:10:54.804510 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 07:10:55 crc kubenswrapper[4776]: I0128 07:10:55.795005 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" containerID="cri-o://7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9" gracePeriod=30 Jan 28 07:10:55 crc kubenswrapper[4776]: I0128 07:10:55.795068 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" containerID="cri-o://23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2" gracePeriod=30 Jan 28 07:10:56 crc kubenswrapper[4776]: I0128 07:10:56.831269 4776 generic.go:334] "Generic (PLEG): container finished" podID="e08b1134-79fa-4e19-9762-7315e271ff02" containerID="7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9" exitCode=143 Jan 28 07:10:56 crc kubenswrapper[4776]: I0128 07:10:56.831563 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerDied","Data":"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9"} Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.764994 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.854374 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data\") pod \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.854573 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgvmn\" (UniqueName: \"kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn\") pod \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.854620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle\") pod \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\" (UID: \"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce\") " Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.862741 4776 generic.go:334] "Generic (PLEG): container finished" podID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" exitCode=0 Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.863094 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce","Type":"ContainerDied","Data":"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806"} Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.863113 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.863136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce","Type":"ContainerDied","Data":"8cadda9a7b4d4009d05e5d9ab512af2c413d26f330dde59f2ddc0274c2a2b2f1"} Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.863160 4776 scope.go:117] "RemoveContainer" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.866418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn" (OuterVolumeSpecName: "kube-api-access-jgvmn") pod "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" (UID: "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce"). InnerVolumeSpecName "kube-api-access-jgvmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.905519 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data" (OuterVolumeSpecName: "config-data") pod "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" (UID: "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.908494 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" (UID: "e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.954246 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:34848->10.217.0.213:8775: read: connection reset by peer" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.954274 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:34840->10.217.0.213:8775: read: connection reset by peer" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.956903 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.956935 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:58 crc kubenswrapper[4776]: I0128 07:10:58.956948 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgvmn\" (UniqueName: 
\"kubernetes.io/projected/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce-kube-api-access-jgvmn\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.061446 4776 scope.go:117] "RemoveContainer" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.062059 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806\": container with ID starting with 86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806 not found: ID does not exist" containerID="86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.062110 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806"} err="failed to get container status \"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806\": rpc error: code = NotFound desc = could not find container \"86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806\": container with ID starting with 86e58c5263e9f7aecd0c2be607f20ad6d9a31d70dd29c5d2d85d9fff8acb7806 not found: ID does not exist" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.209723 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.219795 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.236745 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.237132 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" 
containerName="init" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237145 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerName="init" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.237174 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9c2ccb-51d7-4307-8536-11657989c02d" containerName="nova-manage" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237182 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9c2ccb-51d7-4307-8536-11657989c02d" containerName="nova-manage" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.237207 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerName="dnsmasq-dns" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237213 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" containerName="dnsmasq-dns" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.237227 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerName="nova-scheduler-scheduler" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237234 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerName="nova-scheduler-scheduler" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237421 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9c2ccb-51d7-4307-8536-11657989c02d" containerName="nova-manage" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237443 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" containerName="nova-scheduler-scheduler" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.237460 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="af39015e-a0f9-4ebd-b6c5-865f3081b2da" 
containerName="dnsmasq-dns" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.239286 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.241359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.255750 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.287943 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.341219 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce" path="/var/lib/kubelet/pods/e28c37f2-b73d-4b7e-b8c8-82baffe4d9ce/volumes" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.363594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs\") pod \"e08b1134-79fa-4e19-9762-7315e271ff02\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.363803 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data\") pod \"e08b1134-79fa-4e19-9762-7315e271ff02\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.363872 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b58w\" (UniqueName: \"kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w\") pod \"e08b1134-79fa-4e19-9762-7315e271ff02\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " 
Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.363904 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs\") pod \"e08b1134-79fa-4e19-9762-7315e271ff02\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.363940 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle\") pod \"e08b1134-79fa-4e19-9762-7315e271ff02\" (UID: \"e08b1134-79fa-4e19-9762-7315e271ff02\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.364239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.364294 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz49g\" (UniqueName: \"kubernetes.io/projected/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-kube-api-access-sz49g\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.364364 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-config-data\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.365534 4776 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs" (OuterVolumeSpecName: "logs") pod "e08b1134-79fa-4e19-9762-7315e271ff02" (UID: "e08b1134-79fa-4e19-9762-7315e271ff02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.370731 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w" (OuterVolumeSpecName: "kube-api-access-9b58w") pod "e08b1134-79fa-4e19-9762-7315e271ff02" (UID: "e08b1134-79fa-4e19-9762-7315e271ff02"). InnerVolumeSpecName "kube-api-access-9b58w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.399648 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08b1134-79fa-4e19-9762-7315e271ff02" (UID: "e08b1134-79fa-4e19-9762-7315e271ff02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.402590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data" (OuterVolumeSpecName: "config-data") pod "e08b1134-79fa-4e19-9762-7315e271ff02" (UID: "e08b1134-79fa-4e19-9762-7315e271ff02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.454019 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e08b1134-79fa-4e19-9762-7315e271ff02" (UID: "e08b1134-79fa-4e19-9762-7315e271ff02"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.466937 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz49g\" (UniqueName: \"kubernetes.io/projected/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-kube-api-access-sz49g\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467052 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-config-data\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467210 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08b1134-79fa-4e19-9762-7315e271ff02-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467227 4776 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467238 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b58w\" (UniqueName: \"kubernetes.io/projected/e08b1134-79fa-4e19-9762-7315e271ff02-kube-api-access-9b58w\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467249 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.467257 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08b1134-79fa-4e19-9762-7315e271ff02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.470759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-config-data\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.470855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.487598 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz49g\" (UniqueName: \"kubernetes.io/projected/7e68a0f9-2ffb-43a1-8945-37b6b68b2d43-kube-api-access-sz49g\") pod \"nova-scheduler-0\" (UID: 
\"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43\") " pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.576735 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.830160 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.884994 4776 generic.go:334] "Generic (PLEG): container finished" podID="e08b1134-79fa-4e19-9762-7315e271ff02" containerID="23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2" exitCode=0 Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.885387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerDied","Data":"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2"} Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.885417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e08b1134-79fa-4e19-9762-7315e271ff02","Type":"ContainerDied","Data":"6e88819acfa0f786a387cf4fc95904812512579bba68a020f7142a7e187350e4"} Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.885437 4776 scope.go:117] "RemoveContainer" containerID="23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.885581 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.905566 4776 generic.go:334] "Generic (PLEG): container finished" podID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerID="006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12" exitCode=0 Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.905672 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.905680 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerDied","Data":"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12"} Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.906047 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a38d0cc-0e43-43ae-9710-9689abdfcb15","Type":"ContainerDied","Data":"3e1b85fc3544288b817da43e71db291a83ed0a0643f48eeeb0a51bcffc8e5b86"} Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.937902 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.945617 4776 scope.go:117] "RemoveContainer" containerID="7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.947500 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.961954 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.962362 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-api" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962374 4776 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-api" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.962393 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962401 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.962428 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962435 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.962452 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-log" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962458 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-log" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962634 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-log" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962644 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-metadata" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962665 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" containerName="nova-api-api" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.962680 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" containerName="nova-metadata-log" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.973860 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.978983 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.982806 4776 scope.go:117] "RemoveContainer" containerID="23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2" Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.983647 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2\": container with ID starting with 23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2 not found: ID does not exist" containerID="23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.983682 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2"} err="failed to get container status \"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2\": rpc error: code = NotFound desc = could not find container \"23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2\": container with ID starting with 23bc6ad9f0a018fec2acd772fed3628e675050bcd8beaf521843fdac42d028e2 not found: ID does not exist" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.983715 4776 scope.go:117] "RemoveContainer" containerID="7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984010 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984183 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984326 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984379 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzw6n\" (UniqueName: \"kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.984439 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data\") pod \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\" (UID: \"1a38d0cc-0e43-43ae-9710-9689abdfcb15\") " Jan 28 07:10:59 crc kubenswrapper[4776]: E0128 07:10:59.986261 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9\": container with ID starting with 7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9 not found: ID does not exist" containerID="7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.986310 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9"} err="failed to get container status \"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9\": rpc error: code = NotFound desc = could not find container \"7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9\": container with ID starting with 7db9e65bcc38a94b972fd7d9f72d7e5c61bc8a4cdca3a3fb5cdb59870149f8f9 not found: ID does not exist" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.986399 4776 scope.go:117] "RemoveContainer" containerID="006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12" Jan 28 07:10:59 crc kubenswrapper[4776]: I0128 07:10:59.987787 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs" (OuterVolumeSpecName: "logs") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.002248 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.002759 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n" (OuterVolumeSpecName: "kube-api-access-pzw6n") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "kube-api-access-pzw6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.025619 4776 scope.go:117] "RemoveContainer" containerID="bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.025943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data" (OuterVolumeSpecName: "config-data") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.026441 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.048780 4776 scope.go:117] "RemoveContainer" containerID="006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12" Jan 28 07:11:00 crc kubenswrapper[4776]: E0128 07:11:00.049280 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12\": container with ID starting with 006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12 not found: ID does not exist" containerID="006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.049313 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12"} err="failed to get container status \"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12\": rpc error: code = NotFound desc = could not find container \"006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12\": container with ID starting with 006b9ca974fc85af3c882de1d9c8076b70bcf1f371077c47128aa0cf674cab12 not found: ID does not exist" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.049355 4776 scope.go:117] "RemoveContainer" containerID="bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788" Jan 28 07:11:00 crc kubenswrapper[4776]: E0128 07:11:00.050318 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788\": container with ID starting with bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788 not found: ID does not exist" containerID="bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.050352 
4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788"} err="failed to get container status \"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788\": rpc error: code = NotFound desc = could not find container \"bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788\": container with ID starting with bd5ff175056341e11cca328978de4dfdeeed2409acf042a0709e21112a44f788 not found: ID does not exist" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.053962 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.071013 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a38d0cc-0e43-43ae-9710-9689abdfcb15" (UID: "1a38d0cc-0e43-43ae-9710-9689abdfcb15"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-config-data\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccfck\" (UniqueName: \"kubernetes.io/projected/31eb87d0-ab51-4738-8205-b515b8b57cf1-kube-api-access-ccfck\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086379 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086407 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086459 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31eb87d0-ab51-4738-8205-b515b8b57cf1-logs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc 
kubenswrapper[4776]: I0128 07:11:00.086529 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a38d0cc-0e43-43ae-9710-9689abdfcb15-logs\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086540 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086565 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzw6n\" (UniqueName: \"kubernetes.io/projected/1a38d0cc-0e43-43ae-9710-9689abdfcb15-kube-api-access-pzw6n\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086575 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086583 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.086592 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a38d0cc-0e43-43ae-9710-9689abdfcb15-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.188575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-config-data\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 
07:11:00.188704 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccfck\" (UniqueName: \"kubernetes.io/projected/31eb87d0-ab51-4738-8205-b515b8b57cf1-kube-api-access-ccfck\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.188789 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.188848 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.188888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31eb87d0-ab51-4738-8205-b515b8b57cf1-logs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.189452 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31eb87d0-ab51-4738-8205-b515b8b57cf1-logs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.192477 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.192519 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-config-data\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.194684 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eb87d0-ab51-4738-8205-b515b8b57cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.203506 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccfck\" (UniqueName: \"kubernetes.io/projected/31eb87d0-ab51-4738-8205-b515b8b57cf1-kube-api-access-ccfck\") pod \"nova-metadata-0\" (UID: \"31eb87d0-ab51-4738-8205-b515b8b57cf1\") " pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.252306 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.271450 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.288651 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.298218 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.302753 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.306072 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.309284 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.309716 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.310052 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.328285 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.393381 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-config-data\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.393718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.393774 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.393894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chdlh\" (UniqueName: \"kubernetes.io/projected/9ca740aa-b1f4-4878-93f4-116c2c17ff53-kube-api-access-chdlh\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.393952 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.394057 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca740aa-b1f4-4878-93f4-116c2c17ff53-logs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.495414 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chdlh\" (UniqueName: \"kubernetes.io/projected/9ca740aa-b1f4-4878-93f4-116c2c17ff53-kube-api-access-chdlh\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.495471 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " 
pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.495511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca740aa-b1f4-4878-93f4-116c2c17ff53-logs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.495603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-config-data\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.495631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.496254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca740aa-b1f4-4878-93f4-116c2c17ff53-logs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.496449 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.499846 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.499976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-config-data\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.502235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.502284 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca740aa-b1f4-4878-93f4-116c2c17ff53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.512214 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chdlh\" (UniqueName: \"kubernetes.io/projected/9ca740aa-b1f4-4878-93f4-116c2c17ff53-kube-api-access-chdlh\") pod \"nova-api-0\" (UID: \"9ca740aa-b1f4-4878-93f4-116c2c17ff53\") " pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.618321 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.773793 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 07:11:00 crc kubenswrapper[4776]: W0128 07:11:00.783132 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31eb87d0_ab51_4738_8205_b515b8b57cf1.slice/crio-70791d3ff5bc97ea870a6191b2a9c7116e83695e9e126dd2c622bed0c475c569 WatchSource:0}: Error finding container 70791d3ff5bc97ea870a6191b2a9c7116e83695e9e126dd2c622bed0c475c569: Status 404 returned error can't find the container with id 70791d3ff5bc97ea870a6191b2a9c7116e83695e9e126dd2c622bed0c475c569 Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.919577 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43","Type":"ContainerStarted","Data":"d9cb855e4da28ec061427faf70b5290e5a76443bc26644733c67a4190210eac5"} Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.919614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7e68a0f9-2ffb-43a1-8945-37b6b68b2d43","Type":"ContainerStarted","Data":"7f6f717e21df4209854c144eb598074f002b88e21dc02afdf9f232ef4743062f"} Jan 28 07:11:00 crc kubenswrapper[4776]: I0128 07:11:00.923616 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31eb87d0-ab51-4738-8205-b515b8b57cf1","Type":"ContainerStarted","Data":"70791d3ff5bc97ea870a6191b2a9c7116e83695e9e126dd2c622bed0c475c569"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.082061 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.082038996 podStartE2EDuration="2.082038996s" podCreationTimestamp="2026-01-28 07:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:11:00.939495784 +0000 UTC m=+1232.355155934" watchObservedRunningTime="2026-01-28 07:11:01.082038996 +0000 UTC m=+1232.497699156" Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.090103 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 07:11:01 crc kubenswrapper[4776]: W0128 07:11:01.092308 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca740aa_b1f4_4878_93f4_116c2c17ff53.slice/crio-bd26e303f17a65d47016469d82fc93172e8fd11dc295906d629d16fbe4bdee96 WatchSource:0}: Error finding container bd26e303f17a65d47016469d82fc93172e8fd11dc295906d629d16fbe4bdee96: Status 404 returned error can't find the container with id bd26e303f17a65d47016469d82fc93172e8fd11dc295906d629d16fbe4bdee96 Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.321360 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a38d0cc-0e43-43ae-9710-9689abdfcb15" path="/var/lib/kubelet/pods/1a38d0cc-0e43-43ae-9710-9689abdfcb15/volumes" Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.322516 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08b1134-79fa-4e19-9762-7315e271ff02" path="/var/lib/kubelet/pods/e08b1134-79fa-4e19-9762-7315e271ff02/volumes" Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.935510 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31eb87d0-ab51-4738-8205-b515b8b57cf1","Type":"ContainerStarted","Data":"574082ca55cd0183008e47869f3af57e9830259d8d7a94fd695bbd55aa7bde68"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.935581 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"31eb87d0-ab51-4738-8205-b515b8b57cf1","Type":"ContainerStarted","Data":"9ba6eccbeff017d4072b3a6213edeedfbca98a46a02d50cd61cea2950e9c3a87"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.938870 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ca740aa-b1f4-4878-93f4-116c2c17ff53","Type":"ContainerStarted","Data":"de0099816ea3cd4e5a518ed7912d2d68b5ee0bc85e0d42a6b70d16eacc9954bc"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.938918 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ca740aa-b1f4-4878-93f4-116c2c17ff53","Type":"ContainerStarted","Data":"070bf478ec37b4973a75a56e5ef7c0f73aa0c4405c9c38664fac8902a8bb8372"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.938928 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ca740aa-b1f4-4878-93f4-116c2c17ff53","Type":"ContainerStarted","Data":"bd26e303f17a65d47016469d82fc93172e8fd11dc295906d629d16fbe4bdee96"} Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.956368 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.956349188 podStartE2EDuration="2.956349188s" podCreationTimestamp="2026-01-28 07:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:11:01.953762598 +0000 UTC m=+1233.369422758" watchObservedRunningTime="2026-01-28 07:11:01.956349188 +0000 UTC m=+1233.372009348" Jan 28 07:11:01 crc kubenswrapper[4776]: I0128 07:11:01.982113 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.982093813 podStartE2EDuration="1.982093813s" podCreationTimestamp="2026-01-28 07:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-28 07:11:01.975711291 +0000 UTC m=+1233.391371491" watchObservedRunningTime="2026-01-28 07:11:01.982093813 +0000 UTC m=+1233.397753973" Jan 28 07:11:03 crc kubenswrapper[4776]: I0128 07:11:03.852536 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:11:03 crc kubenswrapper[4776]: I0128 07:11:03.853182 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:11:04 crc kubenswrapper[4776]: I0128 07:11:04.577521 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 07:11:05 crc kubenswrapper[4776]: I0128 07:11:05.298739 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:11:05 crc kubenswrapper[4776]: I0128 07:11:05.298802 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 07:11:09 crc kubenswrapper[4776]: I0128 07:11:09.577961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 07:11:09 crc kubenswrapper[4776]: I0128 07:11:09.616326 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 07:11:10 crc kubenswrapper[4776]: I0128 07:11:10.081293 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 07:11:10 crc kubenswrapper[4776]: I0128 07:11:10.299478 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 07:11:10 crc kubenswrapper[4776]: I0128 07:11:10.299581 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 07:11:10 crc kubenswrapper[4776]: I0128 07:11:10.619273 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:11:10 crc kubenswrapper[4776]: I0128 07:11:10.620767 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 07:11:11 crc kubenswrapper[4776]: I0128 07:11:11.313839 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="31eb87d0-ab51-4738-8205-b515b8b57cf1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:11:11 crc kubenswrapper[4776]: I0128 07:11:11.313912 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="31eb87d0-ab51-4738-8205-b515b8b57cf1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:11:11 crc kubenswrapper[4776]: I0128 07:11:11.637000 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ca740aa-b1f4-4878-93f4-116c2c17ff53" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:11:11 crc kubenswrapper[4776]: I0128 07:11:11.637015 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ca740aa-b1f4-4878-93f4-116c2c17ff53" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 07:11:16 crc kubenswrapper[4776]: I0128 07:11:16.049411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.305214 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.305775 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.312152 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.312609 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.628954 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.629328 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.630928 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 28 07:11:20 crc kubenswrapper[4776]: I0128 07:11:20.638287 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 07:11:21 crc kubenswrapper[4776]: I0128 07:11:21.143887 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 28 07:11:21 crc kubenswrapper[4776]: I0128 07:11:21.152173 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 28 07:11:29 crc kubenswrapper[4776]: I0128 07:11:29.493272 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:30 crc kubenswrapper[4776]: I0128 07:11:30.559429 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:33 crc kubenswrapper[4776]: I0128 07:11:33.342114 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="rabbitmq" containerID="cri-o://0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562" gracePeriod=604797 Jan 28 07:11:33 crc kubenswrapper[4776]: I0128 07:11:33.851564 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:11:33 crc kubenswrapper[4776]: I0128 07:11:33.851619 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:11:33 crc kubenswrapper[4776]: I0128 07:11:33.851664 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:11:33 crc kubenswrapper[4776]: I0128 07:11:33.852287 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:11:33 
crc kubenswrapper[4776]: I0128 07:11:33.852343 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0" gracePeriod=600 Jan 28 07:11:34 crc kubenswrapper[4776]: E0128 07:11:34.014054 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3539113f_fe53_40a0_a08c_d7f86951d067.slice/crio-conmon-b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0.scope\": RecentStats: unable to find data in memory cache]" Jan 28 07:11:34 crc kubenswrapper[4776]: I0128 07:11:34.286727 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0" exitCode=0 Jan 28 07:11:34 crc kubenswrapper[4776]: I0128 07:11:34.286806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0"} Jan 28 07:11:34 crc kubenswrapper[4776]: I0128 07:11:34.287334 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d"} Jan 28 07:11:34 crc kubenswrapper[4776]: I0128 07:11:34.287412 4776 scope.go:117] "RemoveContainer" containerID="ee9888a6c6a796ef3ecd16fb5509f4cb1473705dc001b450840175052867c944" Jan 28 07:11:34 crc kubenswrapper[4776]: I0128 07:11:34.517984 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="rabbitmq" containerID="cri-o://ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92" gracePeriod=604797 Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.060267 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.123911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.123948 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.123976 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124073 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124095 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124109 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124157 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124170 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjg4m\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc 
kubenswrapper[4776]: I0128 07:11:40.124192 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie\") pod \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\" (UID: \"0aae9df9-4aee-48fa-aa96-4f93f55be39f\") " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.124905 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.126106 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.126534 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.130960 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info" (OuterVolumeSpecName: "pod-info") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.131469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m" (OuterVolumeSpecName: "kube-api-access-sjg4m") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "kube-api-access-sjg4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.132164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.132322 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.143876 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.155532 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data" (OuterVolumeSpecName: "config-data") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226250 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226292 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226302 4776 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0aae9df9-4aee-48fa-aa96-4f93f55be39f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226311 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc 
kubenswrapper[4776]: I0128 07:11:40.226321 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226330 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0aae9df9-4aee-48fa-aa96-4f93f55be39f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226338 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226346 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjg4m\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-kube-api-access-sjg4m\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.226355 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.246037 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf" (OuterVolumeSpecName: "server-conf") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.252052 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.266387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0aae9df9-4aee-48fa-aa96-4f93f55be39f" (UID: "0aae9df9-4aee-48fa-aa96-4f93f55be39f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.328412 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.329591 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0aae9df9-4aee-48fa-aa96-4f93f55be39f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.329703 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0aae9df9-4aee-48fa-aa96-4f93f55be39f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.362749 4776 generic.go:334] "Generic (PLEG): container finished" podID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerID="0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562" exitCode=0 Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.362787 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerDied","Data":"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562"} Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.362810 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0aae9df9-4aee-48fa-aa96-4f93f55be39f","Type":"ContainerDied","Data":"79e27f88edee48bf353a689aaa4183425ba6414e5204161a2f2fce5ba58d40e0"} Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.362827 4776 scope.go:117] "RemoveContainer" containerID="0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.362939 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.398443 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.402901 4776 scope.go:117] "RemoveContainer" containerID="5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.406846 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.426481 4776 scope.go:117] "RemoveContainer" containerID="0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562" Jan 28 07:11:40 crc kubenswrapper[4776]: E0128 07:11:40.427453 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562\": container with ID starting with 0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562 not found: ID does not exist" containerID="0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.427507 
4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562"} err="failed to get container status \"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562\": rpc error: code = NotFound desc = could not find container \"0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562\": container with ID starting with 0b881277df6a126ad34bd8b1993d81a7a7c80a1b5531cec40707a0b17ac70562 not found: ID does not exist" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.427534 4776 scope.go:117] "RemoveContainer" containerID="5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503" Jan 28 07:11:40 crc kubenswrapper[4776]: E0128 07:11:40.429846 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503\": container with ID starting with 5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503 not found: ID does not exist" containerID="5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.429874 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503"} err="failed to get container status \"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503\": rpc error: code = NotFound desc = could not find container \"5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503\": container with ID starting with 5779391dd89a09e63052867e602f4c12048c07c8fc455061db7d0096bcec5503 not found: ID does not exist" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.429909 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:40 crc kubenswrapper[4776]: E0128 07:11:40.430294 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="setup-container" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.430314 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="setup-container" Jan 28 07:11:40 crc kubenswrapper[4776]: E0128 07:11:40.430355 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="rabbitmq" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.430362 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="rabbitmq" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.430521 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" containerName="rabbitmq" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.431820 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435106 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s9p8g" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435335 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435566 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435680 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435792 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.435897 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.436032 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.441262 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.538907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cfd0885-0776-471c-b8f4-afb359e460b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539064 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539104 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539154 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cfd0885-0776-471c-b8f4-afb359e460b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539359 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjw6c\" (UniqueName: 
\"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-kube-api-access-wjw6c\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539456 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539491 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539518 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.539584 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640389 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-confd\") 
pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640590 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cfd0885-0776-471c-b8f4-afb359e460b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640624 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjw6c\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-kube-api-access-wjw6c\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640652 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640675 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640785 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cfd0885-0776-471c-b8f4-afb359e460b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.640836 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.641256 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.641431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.641983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.642230 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.642826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cfd0885-0776-471c-b8f4-afb359e460b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.644376 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " 
pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.645460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cfd0885-0776-471c-b8f4-afb359e460b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.652395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.656130 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cfd0885-0776-471c-b8f4-afb359e460b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.656225 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.660667 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjw6c\" (UniqueName: \"kubernetes.io/projected/4cfd0885-0776-471c-b8f4-afb359e460b2-kube-api-access-wjw6c\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.698946 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"4cfd0885-0776-471c-b8f4-afb359e460b2\") " pod="openstack/rabbitmq-server-0" Jan 28 07:11:40 crc kubenswrapper[4776]: I0128 07:11:40.749441 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.216466 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.257827 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.257978 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258089 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258125 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: 
I0128 07:11:41.258171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258212 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258309 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258385 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258460 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xlq\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258537 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd\") 
pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.258792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"c544ad4a-db14-419a-b423-435e8416f597\" (UID: \"c544ad4a-db14-419a-b423-435e8416f597\") " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.260852 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.266572 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.272239 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info" (OuterVolumeSpecName: "pod-info") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.275683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.277964 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq" (OuterVolumeSpecName: "kube-api-access-p6xlq") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "kube-api-access-p6xlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.280885 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.281722 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.313078 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.336047 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data" (OuterVolumeSpecName: "config-data") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.342147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf" (OuterVolumeSpecName: "server-conf") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.347758 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aae9df9-4aee-48fa-aa96-4f93f55be39f" path="/var/lib/kubelet/pods/0aae9df9-4aee-48fa-aa96-4f93f55be39f/volumes" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361189 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-server-conf\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361214 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c544ad4a-db14-419a-b423-435e8416f597-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361227 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xlq\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-kube-api-access-p6xlq\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361246 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361256 4776 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c544ad4a-db14-419a-b423-435e8416f597-pod-info\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361264 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361274 4776 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361282 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361290 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.361298 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c544ad4a-db14-419a-b423-435e8416f597-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.363005 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.373827 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cfd0885-0776-471c-b8f4-afb359e460b2","Type":"ContainerStarted","Data":"1942b0ae8a436ee5eccca448972522d9f645b0bc9baaf966f4166c582d6cb339"} Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.377053 4776 generic.go:334] "Generic (PLEG): container finished" podID="c544ad4a-db14-419a-b423-435e8416f597" containerID="ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92" exitCode=0 Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.377115 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.377134 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerDied","Data":"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92"} Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.377431 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c544ad4a-db14-419a-b423-435e8416f597","Type":"ContainerDied","Data":"fb75978b93e00c3c602fc449419650bd95c41c4a2fb0172ca81a4c18a0425fda"} Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.377461 4776 scope.go:117] "RemoveContainer" containerID="ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.388342 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.413280 4776 scope.go:117] "RemoveContainer" containerID="00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.424321 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c544ad4a-db14-419a-b423-435e8416f597" (UID: "c544ad4a-db14-419a-b423-435e8416f597"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.436900 4776 scope.go:117] "RemoveContainer" containerID="ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92" Jan 28 07:11:41 crc kubenswrapper[4776]: E0128 07:11:41.437525 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92\": container with ID starting with ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92 not found: ID does not exist" containerID="ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.437565 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92"} err="failed to get container status \"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92\": rpc error: code = NotFound desc = could not find container \"ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92\": container with ID starting with ad54de427c0b143e1a6651cac9dbacb6c50ef52e4a818312af689c999ed3fc92 not found: ID does not exist" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.437587 4776 scope.go:117] "RemoveContainer" containerID="00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d" Jan 28 07:11:41 crc kubenswrapper[4776]: E0128 07:11:41.437894 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d\": container with ID starting with 00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d not found: ID does not exist" containerID="00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.437912 
4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d"} err="failed to get container status \"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d\": rpc error: code = NotFound desc = could not find container \"00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d\": container with ID starting with 00df2257a649f71215e07ba7ed61ff51fdbcfe1d66c980c13b5ccb5bd6f0511d not found: ID does not exist" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.463209 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c544ad4a-db14-419a-b423-435e8416f597-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.463245 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.715227 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.726203 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.737103 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:41 crc kubenswrapper[4776]: E0128 07:11:41.737505 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="setup-container" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.737521 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="setup-container" Jan 28 07:11:41 crc kubenswrapper[4776]: E0128 07:11:41.737559 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="rabbitmq" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.737569 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="rabbitmq" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.737790 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c544ad4a-db14-419a-b423-435e8416f597" containerName="rabbitmq" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.738842 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.742115 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.742464 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.742799 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.742987 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.743110 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-db6sr" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.743373 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.744135 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.759504 4776 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.767958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768009 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768030 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768050 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768157 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768500 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.768849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp56r\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-kube-api-access-cp56r\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp56r\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-kube-api-access-cp56r\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870821 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870840 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 
07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870862 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870887 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870914 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870950 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.870978 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.871000 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.871045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.871460 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.871962 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.872019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.872049 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.872436 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.872610 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.875636 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.876665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.888373 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.893189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:41 crc kubenswrapper[4776]: I0128 07:11:41.932500 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp56r\" (UniqueName: \"kubernetes.io/projected/34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1-kube-api-access-cp56r\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.034261 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.064987 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.390599 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.395212 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.401319 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.419618 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484612 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7wpf\" (UniqueName: \"kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484664 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484723 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.484903 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.513137 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 07:11:42 crc kubenswrapper[4776]: W0128 07:11:42.519078 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34ca2bb8_f3ea_4cca_8def_c0f7feb37ac1.slice/crio-c31683e2d81b43304296bec337c05d1bbb60d56142b2b5643cd0af041e6eafb2 WatchSource:0}: Error finding container c31683e2d81b43304296bec337c05d1bbb60d56142b2b5643cd0af041e6eafb2: Status 404 returned error can't find the container with id c31683e2d81b43304296bec337c05d1bbb60d56142b2b5643cd0af041e6eafb2 Jan 28 07:11:42 crc 
kubenswrapper[4776]: I0128 07:11:42.585780 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7wpf\" (UniqueName: \"kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.585827 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.585856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.585881 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.585921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.585968 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.586024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.586925 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.586990 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.587270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.587315 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.587487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.587537 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.608004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7wpf\" (UniqueName: \"kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf\") pod \"dnsmasq-dns-79bd4cc8c9-cddj2\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:42 crc kubenswrapper[4776]: I0128 07:11:42.735801 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:43 crc kubenswrapper[4776]: I0128 07:11:43.180141 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:11:43 crc kubenswrapper[4776]: W0128 07:11:43.183302 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a96e757_9d6b_4309_9024_01d78c4e313b.slice/crio-002909cf6654237c2b0a22d0287d2226279a9f31966088f81608d1c8d0e23643 WatchSource:0}: Error finding container 002909cf6654237c2b0a22d0287d2226279a9f31966088f81608d1c8d0e23643: Status 404 returned error can't find the container with id 002909cf6654237c2b0a22d0287d2226279a9f31966088f81608d1c8d0e23643 Jan 28 07:11:43 crc kubenswrapper[4776]: I0128 07:11:43.327804 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c544ad4a-db14-419a-b423-435e8416f597" path="/var/lib/kubelet/pods/c544ad4a-db14-419a-b423-435e8416f597/volumes" Jan 28 07:11:43 crc kubenswrapper[4776]: I0128 07:11:43.433016 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerStarted","Data":"68f156f888591d7e4910843e1a70de5c7d7f7f24cc52ad8166cfae57e915105d"} Jan 28 07:11:43 crc kubenswrapper[4776]: I0128 07:11:43.433941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerStarted","Data":"002909cf6654237c2b0a22d0287d2226279a9f31966088f81608d1c8d0e23643"} Jan 28 07:11:43 crc kubenswrapper[4776]: I0128 07:11:43.435133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cfd0885-0776-471c-b8f4-afb359e460b2","Type":"ContainerStarted","Data":"d95233b5422e8b1bf725a2397a52c2b82350064b57179d0bc0ca179004995def"} Jan 28 07:11:43 crc kubenswrapper[4776]: 
I0128 07:11:43.436421 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1","Type":"ContainerStarted","Data":"c31683e2d81b43304296bec337c05d1bbb60d56142b2b5643cd0af041e6eafb2"} Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.450456 4776 generic.go:334] "Generic (PLEG): container finished" podID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerID="68f156f888591d7e4910843e1a70de5c7d7f7f24cc52ad8166cfae57e915105d" exitCode=0 Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.450618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerDied","Data":"68f156f888591d7e4910843e1a70de5c7d7f7f24cc52ad8166cfae57e915105d"} Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.450923 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerStarted","Data":"10cc4137eb375c98bfe68124682ea1abe9f883a030fae32c89c209da30720b75"} Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.450958 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.453539 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1","Type":"ContainerStarted","Data":"5f2c4277294ed8bc1aaea10e9d584fb5d30b5505768e8819d1f321c6fc9c4c87"} Jan 28 07:11:44 crc kubenswrapper[4776]: I0128 07:11:44.480780 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" podStartSLOduration=2.480764424 podStartE2EDuration="2.480764424s" podCreationTimestamp="2026-01-28 07:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:11:44.472411688 +0000 UTC m=+1275.888071928" watchObservedRunningTime="2026-01-28 07:11:44.480764424 +0000 UTC m=+1275.896424584" Jan 28 07:11:52 crc kubenswrapper[4776]: I0128 07:11:52.737827 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:11:52 crc kubenswrapper[4776]: I0128 07:11:52.842272 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:11:52 crc kubenswrapper[4776]: I0128 07:11:52.842653 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="dnsmasq-dns" containerID="cri-o://be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e" gracePeriod=10 Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.004842 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f4d4c4b7-lt79s"] Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.006987 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.020864 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f4d4c4b7-lt79s"] Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123357 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-svc\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123759 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123842 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-config\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdbn\" (UniqueName: \"kubernetes.io/projected/d235d829-cf03-466a-a77d-27bf20dc03a0-kube-api-access-bpdbn\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123917 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123935 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-swift-storage-0\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.123958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.225880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.225968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-config\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226016 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdbn\" 
(UniqueName: \"kubernetes.io/projected/d235d829-cf03-466a-a77d-27bf20dc03a0-kube-api-access-bpdbn\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226066 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-swift-storage-0\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226086 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-svc\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-sb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.226930 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-config\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.227337 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-ovsdbserver-nb\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.227333 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-swift-storage-0\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.227514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-dns-svc\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.227943 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d235d829-cf03-466a-a77d-27bf20dc03a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: 
\"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.256568 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdbn\" (UniqueName: \"kubernetes.io/projected/d235d829-cf03-466a-a77d-27bf20dc03a0-kube-api-access-bpdbn\") pod \"dnsmasq-dns-f4d4c4b7-lt79s\" (UID: \"d235d829-cf03-466a-a77d-27bf20dc03a0\") " pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.328379 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.362722 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.434381 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.435133 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.437557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.437716 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w5zp6\" (UniqueName: \"kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.437875 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.438123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb\") pod \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\" (UID: \"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd\") " Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.443673 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6" (OuterVolumeSpecName: "kube-api-access-w5zp6") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "kube-api-access-w5zp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.511365 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.517154 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.518096 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.520147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config" (OuterVolumeSpecName: "config") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.530438 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" (UID: "2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540594 4776 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540625 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540636 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540646 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540655 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.540663 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5zp6\" (UniqueName: \"kubernetes.io/projected/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd-kube-api-access-w5zp6\") on node \"crc\" DevicePath \"\"" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.559191 4776 generic.go:334] "Generic (PLEG): container finished" podID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerID="be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e" exitCode=0 Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.559232 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" event={"ID":"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd","Type":"ContainerDied","Data":"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e"} Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.559258 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" event={"ID":"2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd","Type":"ContainerDied","Data":"0d8e615026bf861c6241b9fbd58646a8c68d6050d190b4b22ad2ab1bd027f991"} Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.559273 4776 scope.go:117] "RemoveContainer" containerID="be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.559399 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-vcgfw" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.581518 4776 scope.go:117] "RemoveContainer" containerID="81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9" Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.599442 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:11:53 crc kubenswrapper[4776]: I0128 07:11:53.609473 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-vcgfw"] Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:53.631188 4776 scope.go:117] "RemoveContainer" containerID="be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e" Jan 28 07:11:54 crc kubenswrapper[4776]: E0128 07:11:53.632979 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e\": container with ID starting with be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e not found: ID does not exist" 
containerID="be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e" Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:53.633019 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e"} err="failed to get container status \"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e\": rpc error: code = NotFound desc = could not find container \"be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e\": container with ID starting with be8eb4a207c189783b7980b93eca23cb7839a42fba29d1034023a31736a9300e not found: ID does not exist" Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:53.633051 4776 scope.go:117] "RemoveContainer" containerID="81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9" Jan 28 07:11:54 crc kubenswrapper[4776]: E0128 07:11:53.633911 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9\": container with ID starting with 81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9 not found: ID does not exist" containerID="81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9" Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:53.633952 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9"} err="failed to get container status \"81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9\": rpc error: code = NotFound desc = could not find container \"81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9\": container with ID starting with 81815979acbeaf91af059a32c078f28c1c170bdb791e25b762d85d30d1f1caf9 not found: ID does not exist" Jan 28 07:11:54 crc kubenswrapper[4776]: W0128 07:11:53.838826 4776 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd235d829_cf03_466a_a77d_27bf20dc03a0.slice/crio-fc509e71bf62ef3eb20d939b70cd801b4409eb4433753ae8170dc6bad06a695b WatchSource:0}: Error finding container fc509e71bf62ef3eb20d939b70cd801b4409eb4433753ae8170dc6bad06a695b: Status 404 returned error can't find the container with id fc509e71bf62ef3eb20d939b70cd801b4409eb4433753ae8170dc6bad06a695b Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:53.839327 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f4d4c4b7-lt79s"] Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:54.570566 4776 generic.go:334] "Generic (PLEG): container finished" podID="d235d829-cf03-466a-a77d-27bf20dc03a0" containerID="1681597aec126947acd4c4ed3e3e37dd5d08fa37128ecd0a05faa41ce221a06d" exitCode=0 Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:54.571602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" event={"ID":"d235d829-cf03-466a-a77d-27bf20dc03a0","Type":"ContainerDied","Data":"1681597aec126947acd4c4ed3e3e37dd5d08fa37128ecd0a05faa41ce221a06d"} Jan 28 07:11:54 crc kubenswrapper[4776]: I0128 07:11:54.571644 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" event={"ID":"d235d829-cf03-466a-a77d-27bf20dc03a0","Type":"ContainerStarted","Data":"fc509e71bf62ef3eb20d939b70cd801b4409eb4433753ae8170dc6bad06a695b"} Jan 28 07:11:55 crc kubenswrapper[4776]: I0128 07:11:55.372874 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" path="/var/lib/kubelet/pods/2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd/volumes" Jan 28 07:11:55 crc kubenswrapper[4776]: I0128 07:11:55.588989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" 
event={"ID":"d235d829-cf03-466a-a77d-27bf20dc03a0","Type":"ContainerStarted","Data":"7895a37375275d38ea1d840a8bc88e9271f5e5ca4e3035e29fe31d6630567e70"} Jan 28 07:11:55 crc kubenswrapper[4776]: I0128 07:11:55.589214 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:11:55 crc kubenswrapper[4776]: I0128 07:11:55.612334 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" podStartSLOduration=3.612315516 podStartE2EDuration="3.612315516s" podCreationTimestamp="2026-01-28 07:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:11:55.610319661 +0000 UTC m=+1287.025979821" watchObservedRunningTime="2026-01-28 07:11:55.612315516 +0000 UTC m=+1287.027975696" Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.330393 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f4d4c4b7-lt79s" Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.417789 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.418059 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="dnsmasq-dns" containerID="cri-o://10cc4137eb375c98bfe68124682ea1abe9f883a030fae32c89c209da30720b75" gracePeriod=10 Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.682822 4776 generic.go:334] "Generic (PLEG): container finished" podID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerID="10cc4137eb375c98bfe68124682ea1abe9f883a030fae32c89c209da30720b75" exitCode=0 Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.682870 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerDied","Data":"10cc4137eb375c98bfe68124682ea1abe9f883a030fae32c89c209da30720b75"} Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.897965 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969185 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969265 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969316 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7wpf\" (UniqueName: \"kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.969520 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb\") pod \"9a96e757-9d6b-4309-9024-01d78c4e313b\" (UID: \"9a96e757-9d6b-4309-9024-01d78c4e313b\") " Jan 28 07:12:03 crc kubenswrapper[4776]: I0128 07:12:03.981706 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf" (OuterVolumeSpecName: "kube-api-access-q7wpf") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "kube-api-access-q7wpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.024348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.029689 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.031662 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.031656 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config" (OuterVolumeSpecName: "config") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.036785 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.065000 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a96e757-9d6b-4309-9024-01d78c4e313b" (UID: "9a96e757-9d6b-4309-9024-01d78c4e313b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071590 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071637 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7wpf\" (UniqueName: \"kubernetes.io/projected/9a96e757-9d6b-4309-9024-01d78c4e313b-kube-api-access-q7wpf\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071651 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071660 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071669 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071677 4776 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.071686 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a96e757-9d6b-4309-9024-01d78c4e313b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.702844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" event={"ID":"9a96e757-9d6b-4309-9024-01d78c4e313b","Type":"ContainerDied","Data":"002909cf6654237c2b0a22d0287d2226279a9f31966088f81608d1c8d0e23643"} Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.703199 4776 scope.go:117] "RemoveContainer" containerID="10cc4137eb375c98bfe68124682ea1abe9f883a030fae32c89c209da30720b75" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.703353 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-cddj2" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.743470 4776 scope.go:117] "RemoveContainer" containerID="68f156f888591d7e4910843e1a70de5c7d7f7f24cc52ad8166cfae57e915105d" Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.745758 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:12:04 crc kubenswrapper[4776]: I0128 07:12:04.768312 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-cddj2"] Jan 28 07:12:05 crc kubenswrapper[4776]: I0128 07:12:05.321648 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" path="/var/lib/kubelet/pods/9a96e757-9d6b-4309-9024-01d78c4e313b/volumes" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.582099 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8"] Jan 28 07:12:15 crc kubenswrapper[4776]: E0128 07:12:15.583242 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="dnsmasq-dns" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.583262 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="dnsmasq-dns" Jan 28 07:12:15 crc kubenswrapper[4776]: E0128 07:12:15.583283 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="init" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.583290 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="init" Jan 28 07:12:15 crc kubenswrapper[4776]: E0128 07:12:15.583308 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="dnsmasq-dns" Jan 28 07:12:15 crc 
kubenswrapper[4776]: I0128 07:12:15.583318 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="dnsmasq-dns" Jan 28 07:12:15 crc kubenswrapper[4776]: E0128 07:12:15.583344 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="init" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.583351 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="init" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.583600 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a96e757-9d6b-4309-9024-01d78c4e313b" containerName="dnsmasq-dns" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.583616 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e3e3c7e-1e20-4ff4-ae6d-19bcd40196bd" containerName="dnsmasq-dns" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.584430 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.587277 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.588043 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.588253 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.591997 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8"] Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.628846 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.729177 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.729264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.729287 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.729303 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z786\" (UniqueName: \"kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.823575 4776 generic.go:334] "Generic (PLEG): container finished" podID="4cfd0885-0776-471c-b8f4-afb359e460b2" containerID="d95233b5422e8b1bf725a2397a52c2b82350064b57179d0bc0ca179004995def" exitCode=0 Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.823650 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cfd0885-0776-471c-b8f4-afb359e460b2","Type":"ContainerDied","Data":"d95233b5422e8b1bf725a2397a52c2b82350064b57179d0bc0ca179004995def"} Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.830921 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.831068 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.831109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.831144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z786\" (UniqueName: \"kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.835172 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.836207 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: 
\"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.836203 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.848520 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z786\" (UniqueName: \"kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:15 crc kubenswrapper[4776]: I0128 07:12:15.954486 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:16 crc kubenswrapper[4776]: I0128 07:12:16.538124 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8"] Jan 28 07:12:16 crc kubenswrapper[4776]: I0128 07:12:16.843395 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" event={"ID":"496f75bc-3d43-4af8-8bf4-c818f9b4db9d","Type":"ContainerStarted","Data":"5b7c2e44ce89b3257b8db77c1e1ca1f9dc7e3f4b23ba42fccedd68e53a3a006c"} Jan 28 07:12:16 crc kubenswrapper[4776]: I0128 07:12:16.847114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4cfd0885-0776-471c-b8f4-afb359e460b2","Type":"ContainerStarted","Data":"11501ae5af52aebfd5baf1824f8c7b392702153cf73c81ff3732e38de2efa176"} Jan 28 07:12:16 crc kubenswrapper[4776]: I0128 07:12:16.848445 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 28 07:12:16 crc kubenswrapper[4776]: I0128 07:12:16.877874 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.877838619 podStartE2EDuration="36.877838619s" podCreationTimestamp="2026-01-28 07:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:12:16.869467243 +0000 UTC m=+1308.285127393" watchObservedRunningTime="2026-01-28 07:12:16.877838619 +0000 UTC m=+1308.293498779" Jan 28 07:12:17 crc kubenswrapper[4776]: I0128 07:12:17.859929 4776 generic.go:334] "Generic (PLEG): container finished" podID="34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1" containerID="5f2c4277294ed8bc1aaea10e9d584fb5d30b5505768e8819d1f321c6fc9c4c87" exitCode=0 Jan 28 07:12:17 crc kubenswrapper[4776]: I0128 07:12:17.859998 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1","Type":"ContainerDied","Data":"5f2c4277294ed8bc1aaea10e9d584fb5d30b5505768e8819d1f321c6fc9c4c87"} Jan 28 07:12:18 crc kubenswrapper[4776]: I0128 07:12:18.873705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1","Type":"ContainerStarted","Data":"f4ccc8d3360c996fc20620651a3276a1fc980487d0f8d786bf2f6422320626fe"} Jan 28 07:12:18 crc kubenswrapper[4776]: I0128 07:12:18.874465 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:12:18 crc kubenswrapper[4776]: I0128 07:12:18.910095 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.910074657 podStartE2EDuration="37.910074657s" podCreationTimestamp="2026-01-28 07:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:12:18.899022298 +0000 UTC m=+1310.314682468" watchObservedRunningTime="2026-01-28 07:12:18.910074657 +0000 UTC m=+1310.325734827" Jan 28 07:12:27 crc kubenswrapper[4776]: I0128 07:12:27.983359 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" event={"ID":"496f75bc-3d43-4af8-8bf4-c818f9b4db9d","Type":"ContainerStarted","Data":"95cc80315dae62d8a5888e59e612b011736c8f017636d2a9af5f7a26c43717c4"} Jan 28 07:12:28 crc kubenswrapper[4776]: I0128 07:12:28.012313 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" podStartSLOduration=2.46340592 podStartE2EDuration="13.012289491s" podCreationTimestamp="2026-01-28 07:12:15 +0000 UTC" firstStartedPulling="2026-01-28 07:12:16.536006923 +0000 UTC 
m=+1307.951667083" lastFinishedPulling="2026-01-28 07:12:27.084890484 +0000 UTC m=+1318.500550654" observedRunningTime="2026-01-28 07:12:28.00375244 +0000 UTC m=+1319.419412630" watchObservedRunningTime="2026-01-28 07:12:28.012289491 +0000 UTC m=+1319.427949661" Jan 28 07:12:30 crc kubenswrapper[4776]: I0128 07:12:30.755828 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 28 07:12:32 crc kubenswrapper[4776]: I0128 07:12:32.068780 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 28 07:12:39 crc kubenswrapper[4776]: I0128 07:12:39.125114 4776 generic.go:334] "Generic (PLEG): container finished" podID="496f75bc-3d43-4af8-8bf4-c818f9b4db9d" containerID="95cc80315dae62d8a5888e59e612b011736c8f017636d2a9af5f7a26c43717c4" exitCode=0 Jan 28 07:12:39 crc kubenswrapper[4776]: I0128 07:12:39.125239 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" event={"ID":"496f75bc-3d43-4af8-8bf4-c818f9b4db9d","Type":"ContainerDied","Data":"95cc80315dae62d8a5888e59e612b011736c8f017636d2a9af5f7a26c43717c4"} Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.573990 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.725394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z786\" (UniqueName: \"kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786\") pod \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.725472 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory\") pod \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.725583 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam\") pod \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.725609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle\") pod \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\" (UID: \"496f75bc-3d43-4af8-8bf4-c818f9b4db9d\") " Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.731988 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "496f75bc-3d43-4af8-8bf4-c818f9b4db9d" (UID: "496f75bc-3d43-4af8-8bf4-c818f9b4db9d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.733595 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786" (OuterVolumeSpecName: "kube-api-access-2z786") pod "496f75bc-3d43-4af8-8bf4-c818f9b4db9d" (UID: "496f75bc-3d43-4af8-8bf4-c818f9b4db9d"). InnerVolumeSpecName "kube-api-access-2z786". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.756590 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory" (OuterVolumeSpecName: "inventory") pod "496f75bc-3d43-4af8-8bf4-c818f9b4db9d" (UID: "496f75bc-3d43-4af8-8bf4-c818f9b4db9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.763676 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "496f75bc-3d43-4af8-8bf4-c818f9b4db9d" (UID: "496f75bc-3d43-4af8-8bf4-c818f9b4db9d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.827738 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.827778 4776 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.827789 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z786\" (UniqueName: \"kubernetes.io/projected/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-kube-api-access-2z786\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:40 crc kubenswrapper[4776]: I0128 07:12:40.827799 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/496f75bc-3d43-4af8-8bf4-c818f9b4db9d-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.161904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" event={"ID":"496f75bc-3d43-4af8-8bf4-c818f9b4db9d","Type":"ContainerDied","Data":"5b7c2e44ce89b3257b8db77c1e1ca1f9dc7e3f4b23ba42fccedd68e53a3a006c"} Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.161961 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7c2e44ce89b3257b8db77c1e1ca1f9dc7e3f4b23ba42fccedd68e53a3a006c" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.162074 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.256153 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt"] Jan 28 07:12:41 crc kubenswrapper[4776]: E0128 07:12:41.260482 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f75bc-3d43-4af8-8bf4-c818f9b4db9d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.260519 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f75bc-3d43-4af8-8bf4-c818f9b4db9d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.260858 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="496f75bc-3d43-4af8-8bf4-c818f9b4db9d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.261696 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.268717 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.269213 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.270098 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.278012 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.301454 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt"] Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.338243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.338298 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfv5l\" (UniqueName: \"kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.338374 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.440386 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.440481 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfv5l\" (UniqueName: \"kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.440655 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.447748 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.447992 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.456510 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfv5l\" (UniqueName: \"kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m4zrt\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:41 crc kubenswrapper[4776]: I0128 07:12:41.590105 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:42 crc kubenswrapper[4776]: I0128 07:12:42.262858 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt"] Jan 28 07:12:43 crc kubenswrapper[4776]: I0128 07:12:43.205503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" event={"ID":"de16818c-1081-4db9-a329-04c845b7ec51","Type":"ContainerStarted","Data":"e985e3e138b1aa88bd69d89beb31ecbb1e93b0c2894be6e03e2efaed5a486e9f"} Jan 28 07:12:43 crc kubenswrapper[4776]: I0128 07:12:43.206197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" event={"ID":"de16818c-1081-4db9-a329-04c845b7ec51","Type":"ContainerStarted","Data":"bd30ec93909e57a85965eb97f5eec328332a3668e07379b8bdd7b7f1f5a50994"} Jan 28 07:12:43 crc kubenswrapper[4776]: I0128 07:12:43.234989 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" podStartSLOduration=1.7949714110000001 podStartE2EDuration="2.234966729s" podCreationTimestamp="2026-01-28 07:12:41 +0000 UTC" firstStartedPulling="2026-01-28 07:12:42.275066584 +0000 UTC m=+1333.690726744" lastFinishedPulling="2026-01-28 07:12:42.715061892 +0000 UTC m=+1334.130722062" observedRunningTime="2026-01-28 07:12:43.225263256 +0000 UTC m=+1334.640923426" watchObservedRunningTime="2026-01-28 07:12:43.234966729 +0000 UTC m=+1334.650626899" Jan 28 07:12:46 crc kubenswrapper[4776]: I0128 07:12:46.248432 4776 generic.go:334] "Generic (PLEG): container finished" podID="de16818c-1081-4db9-a329-04c845b7ec51" containerID="e985e3e138b1aa88bd69d89beb31ecbb1e93b0c2894be6e03e2efaed5a486e9f" exitCode=0 Jan 28 07:12:46 crc kubenswrapper[4776]: I0128 07:12:46.248733 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" event={"ID":"de16818c-1081-4db9-a329-04c845b7ec51","Type":"ContainerDied","Data":"e985e3e138b1aa88bd69d89beb31ecbb1e93b0c2894be6e03e2efaed5a486e9f"} Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.784479 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.880316 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam\") pod \"de16818c-1081-4db9-a329-04c845b7ec51\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.880372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfv5l\" (UniqueName: \"kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l\") pod \"de16818c-1081-4db9-a329-04c845b7ec51\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.880675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory\") pod \"de16818c-1081-4db9-a329-04c845b7ec51\" (UID: \"de16818c-1081-4db9-a329-04c845b7ec51\") " Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.887956 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l" (OuterVolumeSpecName: "kube-api-access-kfv5l") pod "de16818c-1081-4db9-a329-04c845b7ec51" (UID: "de16818c-1081-4db9-a329-04c845b7ec51"). InnerVolumeSpecName "kube-api-access-kfv5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.922442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory" (OuterVolumeSpecName: "inventory") pod "de16818c-1081-4db9-a329-04c845b7ec51" (UID: "de16818c-1081-4db9-a329-04c845b7ec51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.930311 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de16818c-1081-4db9-a329-04c845b7ec51" (UID: "de16818c-1081-4db9-a329-04c845b7ec51"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.983238 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.983626 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de16818c-1081-4db9-a329-04c845b7ec51-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:47 crc kubenswrapper[4776]: I0128 07:12:47.983644 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfv5l\" (UniqueName: \"kubernetes.io/projected/de16818c-1081-4db9-a329-04c845b7ec51-kube-api-access-kfv5l\") on node \"crc\" DevicePath \"\"" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.286414 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" 
event={"ID":"de16818c-1081-4db9-a329-04c845b7ec51","Type":"ContainerDied","Data":"bd30ec93909e57a85965eb97f5eec328332a3668e07379b8bdd7b7f1f5a50994"} Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.286745 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd30ec93909e57a85965eb97f5eec328332a3668e07379b8bdd7b7f1f5a50994" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.286475 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m4zrt" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.359793 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb"] Jan 28 07:12:48 crc kubenswrapper[4776]: E0128 07:12:48.360230 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de16818c-1081-4db9-a329-04c845b7ec51" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.360252 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="de16818c-1081-4db9-a329-04c845b7ec51" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.360463 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="de16818c-1081-4db9-a329-04c845b7ec51" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.361117 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.363524 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.364157 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.364754 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.366871 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.384767 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb"] Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.493283 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.493348 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.493381 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.493711 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58tk\" (UniqueName: \"kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.595945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.596045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.596088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.596195 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58tk\" (UniqueName: \"kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.600160 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.600728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.604438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.627973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58tk\" (UniqueName: \"kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:48 crc kubenswrapper[4776]: I0128 07:12:48.686014 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:12:49 crc kubenswrapper[4776]: I0128 07:12:49.301397 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb"] Jan 28 07:12:49 crc kubenswrapper[4776]: W0128 07:12:49.304093 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3467e9_b4e8_40f9_8e96_3615aa7248ca.slice/crio-2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6 WatchSource:0}: Error finding container 2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6: Status 404 returned error can't find the container with id 2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6 Jan 28 07:12:50 crc kubenswrapper[4776]: I0128 07:12:50.316486 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" event={"ID":"9a3467e9-b4e8-40f9-8e96-3615aa7248ca","Type":"ContainerStarted","Data":"b3995b507a3fb5531b94b696cc36710cd03299369d640414cabf5a0ab72f1135"} Jan 28 07:12:50 crc kubenswrapper[4776]: I0128 07:12:50.317026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" 
event={"ID":"9a3467e9-b4e8-40f9-8e96-3615aa7248ca","Type":"ContainerStarted","Data":"2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6"} Jan 28 07:12:50 crc kubenswrapper[4776]: I0128 07:12:50.340061 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" podStartSLOduration=1.955545286 podStartE2EDuration="2.340028294s" podCreationTimestamp="2026-01-28 07:12:48 +0000 UTC" firstStartedPulling="2026-01-28 07:12:49.307000763 +0000 UTC m=+1340.722660943" lastFinishedPulling="2026-01-28 07:12:49.691483741 +0000 UTC m=+1341.107143951" observedRunningTime="2026-01-28 07:12:50.335117541 +0000 UTC m=+1341.750777741" watchObservedRunningTime="2026-01-28 07:12:50.340028294 +0000 UTC m=+1341.755688464" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.504501 4776 scope.go:117] "RemoveContainer" containerID="08a95b6bd6ee8af42cfb318de3d9ecce10ec1906ab11a18056ad3c82c763e397" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.554327 4776 scope.go:117] "RemoveContainer" containerID="a0d7b5f7851761c9bbfecc56e844c0e290ef65346011ef326ecf8dae468a6995" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.638329 4776 scope.go:117] "RemoveContainer" containerID="81684da9b9cefbdb2d3933b15b710c4e87d68943bb5b190940da8cfe42b7cdd4" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.694704 4776 scope.go:117] "RemoveContainer" containerID="47644434cf328c5ab572708b384e78051d8b9c97b2853d2584103061f2009d63" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.737194 4776 scope.go:117] "RemoveContainer" containerID="f6e78ddf1d896fb0f6b71d14e9adc6d134ce7ec5db4388a24ae61e954d2df014" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.775824 4776 scope.go:117] "RemoveContainer" containerID="b7b5ebe2d3e811df95cce47701109c338d599e4ddf62916a3520cd5be0d9bf38" Jan 28 07:13:45 crc kubenswrapper[4776]: I0128 07:13:45.816526 4776 scope.go:117] "RemoveContainer" 
containerID="27b39f6875088c8f69a97f8aaf7dd625d73dbbec0abbf0730e71e0872238c7d8" Jan 28 07:14:03 crc kubenswrapper[4776]: I0128 07:14:03.851980 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:14:03 crc kubenswrapper[4776]: I0128 07:14:03.853352 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.103401 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.115168 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.123200 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.245239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.245328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.245374 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl42l\" (UniqueName: \"kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.347909 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.348021 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.348091 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl42l\" (UniqueName: \"kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.348498 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.348625 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.370358 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl42l\" (UniqueName: \"kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l\") pod \"community-operators-cnlmq\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.459201 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.851874 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:14:33 crc kubenswrapper[4776]: I0128 07:14:33.852314 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:14:34 crc kubenswrapper[4776]: I0128 07:14:34.010616 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:34 crc kubenswrapper[4776]: I0128 07:14:34.555682 4776 generic.go:334] "Generic (PLEG): container finished" podID="947f31a2-ccff-490d-8541-c38475411a65" containerID="f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69" exitCode=0 Jan 28 07:14:34 crc kubenswrapper[4776]: I0128 07:14:34.555825 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerDied","Data":"f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69"} Jan 28 07:14:34 crc kubenswrapper[4776]: I0128 07:14:34.556203 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerStarted","Data":"37129e9ea0f2379a6e780c18ce911f2e5bef856fa2bf90469a9974f0fb25fd0f"} Jan 28 07:14:35 crc kubenswrapper[4776]: I0128 07:14:35.571059 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerStarted","Data":"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595"} Jan 28 07:14:36 crc kubenswrapper[4776]: I0128 07:14:36.582380 4776 generic.go:334] "Generic (PLEG): container finished" podID="947f31a2-ccff-490d-8541-c38475411a65" containerID="a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595" exitCode=0 Jan 28 07:14:36 crc kubenswrapper[4776]: I0128 07:14:36.582458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerDied","Data":"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595"} Jan 28 07:14:37 crc kubenswrapper[4776]: I0128 07:14:37.593907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerStarted","Data":"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b"} Jan 28 07:14:37 crc kubenswrapper[4776]: I0128 07:14:37.616613 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnlmq" podStartSLOduration=2.205197273 podStartE2EDuration="4.616597695s" podCreationTimestamp="2026-01-28 07:14:33 +0000 UTC" firstStartedPulling="2026-01-28 07:14:34.560194667 +0000 UTC m=+1445.975854867" lastFinishedPulling="2026-01-28 07:14:36.971595119 +0000 UTC m=+1448.387255289" observedRunningTime="2026-01-28 07:14:37.613031579 +0000 UTC m=+1449.028691739" watchObservedRunningTime="2026-01-28 07:14:37.616597695 +0000 UTC m=+1449.032257855" Jan 28 07:14:43 crc kubenswrapper[4776]: I0128 07:14:43.459779 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:43 crc kubenswrapper[4776]: 
I0128 07:14:43.460332 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:43 crc kubenswrapper[4776]: I0128 07:14:43.544529 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:43 crc kubenswrapper[4776]: I0128 07:14:43.724125 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:43 crc kubenswrapper[4776]: I0128 07:14:43.794238 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:45 crc kubenswrapper[4776]: I0128 07:14:45.686044 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cnlmq" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="registry-server" containerID="cri-o://99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b" gracePeriod=2 Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.026541 4776 scope.go:117] "RemoveContainer" containerID="9a0429a49f30cfbbb0fc0eb60c73fc15c585b0a18ffca045b308134e7963123f" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.067964 4776 scope.go:117] "RemoveContainer" containerID="45221c24d2c63a782c9a8645d56674fbeea6852cbaa28c94f4755bbd5e9aa5c5" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.110357 4776 scope.go:117] "RemoveContainer" containerID="a9c5000a28f314609eadfc4c02e1cc1c08f184777067e28b1ad5b5e2b1d2b750" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.134874 4776 scope.go:117] "RemoveContainer" containerID="ca5a8b3804762f9404ba6e324e6030336b870c0929fee944732f4bce4f7e8bd2" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.238534 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.246799 4776 scope.go:117] "RemoveContainer" containerID="251ba1f110367388593330d151d1c88c4d914c55af6a2fd2b3c04e541b32ef84" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.348257 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content\") pod \"947f31a2-ccff-490d-8541-c38475411a65\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.348399 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities\") pod \"947f31a2-ccff-490d-8541-c38475411a65\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.348478 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl42l\" (UniqueName: \"kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l\") pod \"947f31a2-ccff-490d-8541-c38475411a65\" (UID: \"947f31a2-ccff-490d-8541-c38475411a65\") " Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.350627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities" (OuterVolumeSpecName: "utilities") pod "947f31a2-ccff-490d-8541-c38475411a65" (UID: "947f31a2-ccff-490d-8541-c38475411a65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.353912 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l" (OuterVolumeSpecName: "kube-api-access-pl42l") pod "947f31a2-ccff-490d-8541-c38475411a65" (UID: "947f31a2-ccff-490d-8541-c38475411a65"). InnerVolumeSpecName "kube-api-access-pl42l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.395473 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "947f31a2-ccff-490d-8541-c38475411a65" (UID: "947f31a2-ccff-490d-8541-c38475411a65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.451652 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.451729 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947f31a2-ccff-490d-8541-c38475411a65-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.451743 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl42l\" (UniqueName: \"kubernetes.io/projected/947f31a2-ccff-490d-8541-c38475411a65-kube-api-access-pl42l\") on node \"crc\" DevicePath \"\"" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.704788 4776 generic.go:334] "Generic (PLEG): container finished" podID="947f31a2-ccff-490d-8541-c38475411a65" 
containerID="99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b" exitCode=0 Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.704918 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnlmq" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.704923 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerDied","Data":"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b"} Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.705381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnlmq" event={"ID":"947f31a2-ccff-490d-8541-c38475411a65","Type":"ContainerDied","Data":"37129e9ea0f2379a6e780c18ce911f2e5bef856fa2bf90469a9974f0fb25fd0f"} Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.705416 4776 scope.go:117] "RemoveContainer" containerID="99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.735453 4776 scope.go:117] "RemoveContainer" containerID="a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.782429 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.789718 4776 scope.go:117] "RemoveContainer" containerID="f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.795704 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cnlmq"] Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.821955 4776 scope.go:117] "RemoveContainer" containerID="99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b" Jan 28 
07:14:46 crc kubenswrapper[4776]: E0128 07:14:46.822471 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b\": container with ID starting with 99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b not found: ID does not exist" containerID="99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.822512 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b"} err="failed to get container status \"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b\": rpc error: code = NotFound desc = could not find container \"99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b\": container with ID starting with 99efcf5012891cef1d507bd5d08964e55d4fd4aea1ac7143394372896780e20b not found: ID does not exist" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.822539 4776 scope.go:117] "RemoveContainer" containerID="a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595" Jan 28 07:14:46 crc kubenswrapper[4776]: E0128 07:14:46.823032 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595\": container with ID starting with a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595 not found: ID does not exist" containerID="a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.823072 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595"} err="failed to get container status 
\"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595\": rpc error: code = NotFound desc = could not find container \"a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595\": container with ID starting with a2df84f23b51b939ad2f8604ebb71f4bb15ce84cbb75bf649779deb2ea133595 not found: ID does not exist" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.823100 4776 scope.go:117] "RemoveContainer" containerID="f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69" Jan 28 07:14:46 crc kubenswrapper[4776]: E0128 07:14:46.823459 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69\": container with ID starting with f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69 not found: ID does not exist" containerID="f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69" Jan 28 07:14:46 crc kubenswrapper[4776]: I0128 07:14:46.823525 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69"} err="failed to get container status \"f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69\": rpc error: code = NotFound desc = could not find container \"f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69\": container with ID starting with f1a58260e9406d30cf43f5dd9770677af5099970f1f4ae11316eafb82c427e69 not found: ID does not exist" Jan 28 07:14:47 crc kubenswrapper[4776]: I0128 07:14:47.327080 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947f31a2-ccff-490d-8541-c38475411a65" path="/var/lib/kubelet/pods/947f31a2-ccff-490d-8541-c38475411a65/volumes" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.145381 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5"] Jan 28 07:15:00 crc kubenswrapper[4776]: E0128 07:15:00.147346 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="extract-utilities" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.147429 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="extract-utilities" Jan 28 07:15:00 crc kubenswrapper[4776]: E0128 07:15:00.147501 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="registry-server" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.147592 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="registry-server" Jan 28 07:15:00 crc kubenswrapper[4776]: E0128 07:15:00.147648 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="extract-content" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.147696 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="extract-content" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.147915 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="947f31a2-ccff-490d-8541-c38475411a65" containerName="registry-server" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.150057 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.152571 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.154821 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.169246 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5"] Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.246936 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mc76\" (UniqueName: \"kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.247106 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.247187 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.349078 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mc76\" (UniqueName: \"kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.349261 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.349310 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.360363 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.360880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.370205 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mc76\" (UniqueName: \"kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76\") pod \"collect-profiles-29493075-59vj5\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.526893 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:00 crc kubenswrapper[4776]: I0128 07:15:00.998850 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5"] Jan 28 07:15:01 crc kubenswrapper[4776]: I0128 07:15:01.891530 4776 generic.go:334] "Generic (PLEG): container finished" podID="ac762e3d-99f1-442c-be4c-9b31c622a055" containerID="48790fb9c097dcd680e80dc39c870ebb81ebec4eca6685eedfd1b9cde58445c0" exitCode=0 Jan 28 07:15:01 crc kubenswrapper[4776]: I0128 07:15:01.891818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" event={"ID":"ac762e3d-99f1-442c-be4c-9b31c622a055","Type":"ContainerDied","Data":"48790fb9c097dcd680e80dc39c870ebb81ebec4eca6685eedfd1b9cde58445c0"} Jan 28 07:15:01 crc kubenswrapper[4776]: I0128 07:15:01.891906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" 
event={"ID":"ac762e3d-99f1-442c-be4c-9b31c622a055","Type":"ContainerStarted","Data":"1e30c1e22a86a0447b4b1523ed82dcb56416de34cd6d5b0c4b686bdeb760fbd6"} Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.271505 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.411426 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume\") pod \"ac762e3d-99f1-442c-be4c-9b31c622a055\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.411584 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume\") pod \"ac762e3d-99f1-442c-be4c-9b31c622a055\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.412473 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac762e3d-99f1-442c-be4c-9b31c622a055" (UID: "ac762e3d-99f1-442c-be4c-9b31c622a055"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.412600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mc76\" (UniqueName: \"kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76\") pod \"ac762e3d-99f1-442c-be4c-9b31c622a055\" (UID: \"ac762e3d-99f1-442c-be4c-9b31c622a055\") " Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.413506 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac762e3d-99f1-442c-be4c-9b31c622a055-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.417736 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76" (OuterVolumeSpecName: "kube-api-access-6mc76") pod "ac762e3d-99f1-442c-be4c-9b31c622a055" (UID: "ac762e3d-99f1-442c-be4c-9b31c622a055"). InnerVolumeSpecName "kube-api-access-6mc76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.419285 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac762e3d-99f1-442c-be4c-9b31c622a055" (UID: "ac762e3d-99f1-442c-be4c-9b31c622a055"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.516458 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac762e3d-99f1-442c-be4c-9b31c622a055-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.516528 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mc76\" (UniqueName: \"kubernetes.io/projected/ac762e3d-99f1-442c-be4c-9b31c622a055-kube-api-access-6mc76\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.852441 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.852501 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.852551 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.853351 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 
07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.853414 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d" gracePeriod=600 Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.912969 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" event={"ID":"ac762e3d-99f1-442c-be4c-9b31c622a055","Type":"ContainerDied","Data":"1e30c1e22a86a0447b4b1523ed82dcb56416de34cd6d5b0c4b686bdeb760fbd6"} Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.913205 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e30c1e22a86a0447b4b1523ed82dcb56416de34cd6d5b0c4b686bdeb760fbd6" Jan 28 07:15:03 crc kubenswrapper[4776]: I0128 07:15:03.913060 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5" Jan 28 07:15:04 crc kubenswrapper[4776]: I0128 07:15:04.926357 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d" exitCode=0 Jan 28 07:15:04 crc kubenswrapper[4776]: I0128 07:15:04.926493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d"} Jan 28 07:15:04 crc kubenswrapper[4776]: I0128 07:15:04.926953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"} Jan 28 07:15:04 crc kubenswrapper[4776]: I0128 07:15:04.926986 4776 scope.go:117] "RemoveContainer" containerID="b457c6912252b5a2b63136be6ba788b7188f99f3764ee744c448695e511964d0" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.890109 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:20 crc kubenswrapper[4776]: E0128 07:15:20.891444 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac762e3d-99f1-442c-be4c-9b31c622a055" containerName="collect-profiles" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.891461 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac762e3d-99f1-442c-be4c-9b31c622a055" containerName="collect-profiles" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.891691 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac762e3d-99f1-442c-be4c-9b31c622a055" containerName="collect-profiles" Jan 28 07:15:20 
crc kubenswrapper[4776]: I0128 07:15:20.893456 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.910613 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.984619 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.984958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:20 crc kubenswrapper[4776]: I0128 07:15:20.985056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwdm\" (UniqueName: \"kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.087408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 
28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.088057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.088089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwdm\" (UniqueName: \"kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.088972 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.089266 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.109928 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwdm\" (UniqueName: \"kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm\") pod \"certified-operators-glgxk\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: 
I0128 07:15:21.262320 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:21 crc kubenswrapper[4776]: I0128 07:15:21.687873 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:22 crc kubenswrapper[4776]: I0128 07:15:22.109214 4776 generic.go:334] "Generic (PLEG): container finished" podID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerID="0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9" exitCode=0 Jan 28 07:15:22 crc kubenswrapper[4776]: I0128 07:15:22.109268 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerDied","Data":"0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9"} Jan 28 07:15:22 crc kubenswrapper[4776]: I0128 07:15:22.109757 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerStarted","Data":"e0eef29586aa20bbc9d4b106cb01081e14d6396ae3cd3cebcb2cb2c3a1556fd5"} Jan 28 07:15:24 crc kubenswrapper[4776]: I0128 07:15:24.128470 4776 generic.go:334] "Generic (PLEG): container finished" podID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerID="fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2" exitCode=0 Jan 28 07:15:24 crc kubenswrapper[4776]: I0128 07:15:24.128950 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerDied","Data":"fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2"} Jan 28 07:15:25 crc kubenswrapper[4776]: I0128 07:15:25.139290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" 
event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerStarted","Data":"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc"} Jan 28 07:15:25 crc kubenswrapper[4776]: I0128 07:15:25.161603 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-glgxk" podStartSLOduration=2.7375381389999998 podStartE2EDuration="5.161583628s" podCreationTimestamp="2026-01-28 07:15:20 +0000 UTC" firstStartedPulling="2026-01-28 07:15:22.1123632 +0000 UTC m=+1493.528023370" lastFinishedPulling="2026-01-28 07:15:24.536408689 +0000 UTC m=+1495.952068859" observedRunningTime="2026-01-28 07:15:25.159150202 +0000 UTC m=+1496.574810362" watchObservedRunningTime="2026-01-28 07:15:25.161583628 +0000 UTC m=+1496.577243788" Jan 28 07:15:31 crc kubenswrapper[4776]: I0128 07:15:31.262715 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:31 crc kubenswrapper[4776]: I0128 07:15:31.263341 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:31 crc kubenswrapper[4776]: I0128 07:15:31.329369 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:32 crc kubenswrapper[4776]: I0128 07:15:32.270708 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:32 crc kubenswrapper[4776]: I0128 07:15:32.328769 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.239296 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-glgxk" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="registry-server" 
containerID="cri-o://23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc" gracePeriod=2 Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.696791 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.886244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content\") pod \"edd8eff4-afa4-4015-810e-aff9f60b222d\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.886364 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kwdm\" (UniqueName: \"kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm\") pod \"edd8eff4-afa4-4015-810e-aff9f60b222d\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.886429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities\") pod \"edd8eff4-afa4-4015-810e-aff9f60b222d\" (UID: \"edd8eff4-afa4-4015-810e-aff9f60b222d\") " Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.888516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities" (OuterVolumeSpecName: "utilities") pod "edd8eff4-afa4-4015-810e-aff9f60b222d" (UID: "edd8eff4-afa4-4015-810e-aff9f60b222d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.895648 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm" (OuterVolumeSpecName: "kube-api-access-5kwdm") pod "edd8eff4-afa4-4015-810e-aff9f60b222d" (UID: "edd8eff4-afa4-4015-810e-aff9f60b222d"). InnerVolumeSpecName "kube-api-access-5kwdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.933901 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edd8eff4-afa4-4015-810e-aff9f60b222d" (UID: "edd8eff4-afa4-4015-810e-aff9f60b222d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.989115 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.989168 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kwdm\" (UniqueName: \"kubernetes.io/projected/edd8eff4-afa4-4015-810e-aff9f60b222d-kube-api-access-5kwdm\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:34 crc kubenswrapper[4776]: I0128 07:15:34.989184 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd8eff4-afa4-4015-810e-aff9f60b222d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.250538 4776 generic.go:334] "Generic (PLEG): container finished" podID="edd8eff4-afa4-4015-810e-aff9f60b222d" 
containerID="23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc" exitCode=0 Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.250646 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glgxk" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.251711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerDied","Data":"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc"} Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.251803 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glgxk" event={"ID":"edd8eff4-afa4-4015-810e-aff9f60b222d","Type":"ContainerDied","Data":"e0eef29586aa20bbc9d4b106cb01081e14d6396ae3cd3cebcb2cb2c3a1556fd5"} Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.251848 4776 scope.go:117] "RemoveContainer" containerID="23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.284856 4776 scope.go:117] "RemoveContainer" containerID="fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.287076 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.298456 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-glgxk"] Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.319491 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" path="/var/lib/kubelet/pods/edd8eff4-afa4-4015-810e-aff9f60b222d/volumes" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.326746 4776 scope.go:117] "RemoveContainer" 
containerID="0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.385209 4776 scope.go:117] "RemoveContainer" containerID="23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc" Jan 28 07:15:35 crc kubenswrapper[4776]: E0128 07:15:35.390259 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc\": container with ID starting with 23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc not found: ID does not exist" containerID="23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.390316 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc"} err="failed to get container status \"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc\": rpc error: code = NotFound desc = could not find container \"23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc\": container with ID starting with 23e40c987bc94988ee8eadb753eade6779b5cbbd219db9a6e76dd653a5e9d3fc not found: ID does not exist" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.390352 4776 scope.go:117] "RemoveContainer" containerID="fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2" Jan 28 07:15:35 crc kubenswrapper[4776]: E0128 07:15:35.393995 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2\": container with ID starting with fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2 not found: ID does not exist" containerID="fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2" Jan 28 07:15:35 crc 
kubenswrapper[4776]: I0128 07:15:35.394043 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2"} err="failed to get container status \"fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2\": rpc error: code = NotFound desc = could not find container \"fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2\": container with ID starting with fa65e6e98dea139c5b404e9805cfb46d6cc10b1f66c19533ba144ea5c2df5cf2 not found: ID does not exist" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.394077 4776 scope.go:117] "RemoveContainer" containerID="0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9" Jan 28 07:15:35 crc kubenswrapper[4776]: E0128 07:15:35.394397 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9\": container with ID starting with 0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9 not found: ID does not exist" containerID="0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9" Jan 28 07:15:35 crc kubenswrapper[4776]: I0128 07:15:35.394437 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9"} err="failed to get container status \"0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9\": rpc error: code = NotFound desc = could not find container \"0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9\": container with ID starting with 0096d4c132929ea89c268ae3122eebda3e545daa0ceb7c18f3ad69dfb65629d9 not found: ID does not exist" Jan 28 07:16:02 crc kubenswrapper[4776]: I0128 07:16:02.543934 4776 generic.go:334] "Generic (PLEG): container finished" podID="9a3467e9-b4e8-40f9-8e96-3615aa7248ca" 
containerID="b3995b507a3fb5531b94b696cc36710cd03299369d640414cabf5a0ab72f1135" exitCode=0 Jan 28 07:16:02 crc kubenswrapper[4776]: I0128 07:16:02.543995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" event={"ID":"9a3467e9-b4e8-40f9-8e96-3615aa7248ca","Type":"ContainerDied","Data":"b3995b507a3fb5531b94b696cc36710cd03299369d640414cabf5a0ab72f1135"} Jan 28 07:16:03 crc kubenswrapper[4776]: I0128 07:16:03.956725 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.029665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam\") pod \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.029734 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle\") pod \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.029767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x58tk\" (UniqueName: \"kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk\") pod \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.029797 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory\") pod \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\" (UID: \"9a3467e9-b4e8-40f9-8e96-3615aa7248ca\") " Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.035389 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk" (OuterVolumeSpecName: "kube-api-access-x58tk") pod "9a3467e9-b4e8-40f9-8e96-3615aa7248ca" (UID: "9a3467e9-b4e8-40f9-8e96-3615aa7248ca"). InnerVolumeSpecName "kube-api-access-x58tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.035473 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9a3467e9-b4e8-40f9-8e96-3615aa7248ca" (UID: "9a3467e9-b4e8-40f9-8e96-3615aa7248ca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.059762 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a3467e9-b4e8-40f9-8e96-3615aa7248ca" (UID: "9a3467e9-b4e8-40f9-8e96-3615aa7248ca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.071452 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory" (OuterVolumeSpecName: "inventory") pod "9a3467e9-b4e8-40f9-8e96-3615aa7248ca" (UID: "9a3467e9-b4e8-40f9-8e96-3615aa7248ca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.132850 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.132923 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.132943 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58tk\" (UniqueName: \"kubernetes.io/projected/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-kube-api-access-x58tk\") on node \"crc\" DevicePath \"\"" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.132960 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a3467e9-b4e8-40f9-8e96-3615aa7248ca-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.569645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" event={"ID":"9a3467e9-b4e8-40f9-8e96-3615aa7248ca","Type":"ContainerDied","Data":"2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6"} Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.569694 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc103fd2d133d4c58becf9da23d0de6befaee317daf99bbe311492a4360c7e6" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.569760 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.699848 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp"] Jan 28 07:16:04 crc kubenswrapper[4776]: E0128 07:16:04.703391 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="registry-server" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.703433 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="registry-server" Jan 28 07:16:04 crc kubenswrapper[4776]: E0128 07:16:04.703476 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="extract-utilities" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.703491 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="extract-utilities" Jan 28 07:16:04 crc kubenswrapper[4776]: E0128 07:16:04.703518 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="extract-content" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.703530 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="extract-content" Jan 28 07:16:04 crc kubenswrapper[4776]: E0128 07:16:04.703607 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3467e9-b4e8-40f9-8e96-3615aa7248ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.703621 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3467e9-b4e8-40f9-8e96-3615aa7248ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.704350 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd8eff4-afa4-4015-810e-aff9f60b222d" containerName="registry-server" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.704429 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3467e9-b4e8-40f9-8e96-3615aa7248ca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.705825 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.713337 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.713725 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.714043 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.714256 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.734758 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp"] Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.818668 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrvx\" (UniqueName: \"kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc 
kubenswrapper[4776]: I0128 07:16:04.818874 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.818918 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.920760 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrvx\" (UniqueName: \"kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.921237 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.921378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.926028 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.928911 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:04 crc kubenswrapper[4776]: I0128 07:16:04.941470 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrvx\" (UniqueName: \"kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.031421 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.557961 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp"] Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.558002 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.581113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" event={"ID":"ea52630b-ebcc-41d5-9265-eec1e8ae437d","Type":"ContainerStarted","Data":"a1b1e5a058efb3dfd2d5cdd04cc531baab0277488ade1aad7f3cc558593c337b"} Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.650927 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"] Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.654226 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.661566 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"] Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.738494 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.738617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.738778 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55txw\" (UniqueName: \"kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.840739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.841081 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-55txw\" (UniqueName: \"kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.841222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.841721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.841841 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.860859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55txw\" (UniqueName: \"kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw\") pod \"redhat-operators-n6xfv\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") " pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:05 crc kubenswrapper[4776]: I0128 07:16:05.990015 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xfv" Jan 28 07:16:06 crc kubenswrapper[4776]: I0128 07:16:06.438004 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"] Jan 28 07:16:06 crc kubenswrapper[4776]: W0128 07:16:06.440305 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ce633a_e963_458c_bbed_900cc02cedbb.slice/crio-de2cfe3b19f555d688ac02fb85738b0ab80590193babbddf9e301741c96705b3 WatchSource:0}: Error finding container de2cfe3b19f555d688ac02fb85738b0ab80590193babbddf9e301741c96705b3: Status 404 returned error can't find the container with id de2cfe3b19f555d688ac02fb85738b0ab80590193babbddf9e301741c96705b3 Jan 28 07:16:06 crc kubenswrapper[4776]: I0128 07:16:06.593956 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerStarted","Data":"de2cfe3b19f555d688ac02fb85738b0ab80590193babbddf9e301741c96705b3"} Jan 28 07:16:06 crc kubenswrapper[4776]: I0128 07:16:06.595525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" event={"ID":"ea52630b-ebcc-41d5-9265-eec1e8ae437d","Type":"ContainerStarted","Data":"f6abd2b6acb94cf279e6215537b7b485f529c4d7187eed46bd600dd72654cc57"} Jan 28 07:16:06 crc kubenswrapper[4776]: I0128 07:16:06.617485 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" podStartSLOduration=2.217811238 podStartE2EDuration="2.617427229s" podCreationTimestamp="2026-01-28 07:16:04 +0000 UTC" firstStartedPulling="2026-01-28 07:16:05.557728869 +0000 UTC m=+1536.973389039" lastFinishedPulling="2026-01-28 07:16:05.95734487 +0000 UTC m=+1537.373005030" observedRunningTime="2026-01-28 07:16:06.611623072 +0000 UTC 
m=+1538.027283232" watchObservedRunningTime="2026-01-28 07:16:06.617427229 +0000 UTC m=+1538.033087389" Jan 28 07:16:07 crc kubenswrapper[4776]: I0128 07:16:07.607760 4776 generic.go:334] "Generic (PLEG): container finished" podID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerID="42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f" exitCode=0 Jan 28 07:16:07 crc kubenswrapper[4776]: I0128 07:16:07.607861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerDied","Data":"42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f"} Jan 28 07:16:09 crc kubenswrapper[4776]: I0128 07:16:09.630207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerStarted","Data":"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"} Jan 28 07:16:10 crc kubenswrapper[4776]: I0128 07:16:10.642613 4776 generic.go:334] "Generic (PLEG): container finished" podID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerID="dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5" exitCode=0 Jan 28 07:16:10 crc kubenswrapper[4776]: I0128 07:16:10.642667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerDied","Data":"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"} Jan 28 07:16:11 crc kubenswrapper[4776]: I0128 07:16:11.652875 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerStarted","Data":"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"} Jan 28 07:16:11 crc kubenswrapper[4776]: I0128 07:16:11.675012 4776 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n6xfv" podStartSLOduration=3.245600039 podStartE2EDuration="6.674993034s" podCreationTimestamp="2026-01-28 07:16:05 +0000 UTC" firstStartedPulling="2026-01-28 07:16:07.61077358 +0000 UTC m=+1539.026433780" lastFinishedPulling="2026-01-28 07:16:11.040166615 +0000 UTC m=+1542.455826775" observedRunningTime="2026-01-28 07:16:11.67297867 +0000 UTC m=+1543.088638850" watchObservedRunningTime="2026-01-28 07:16:11.674993034 +0000 UTC m=+1543.090653194"
Jan 28 07:16:15 crc kubenswrapper[4776]: I0128 07:16:15.990619 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:15 crc kubenswrapper[4776]: I0128 07:16:15.991178 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:17 crc kubenswrapper[4776]: I0128 07:16:17.080714 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n6xfv" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="registry-server" probeResult="failure" output=<
Jan 28 07:16:17 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Jan 28 07:16:17 crc kubenswrapper[4776]: >
Jan 28 07:16:26 crc kubenswrapper[4776]: I0128 07:16:26.045706 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:26 crc kubenswrapper[4776]: I0128 07:16:26.111226 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:26 crc kubenswrapper[4776]: I0128 07:16:26.286408 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"]
Jan 28 07:16:27 crc kubenswrapper[4776]: I0128 07:16:27.808363 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n6xfv" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="registry-server" containerID="cri-o://e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857" gracePeriod=2
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.347639 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.431877 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55txw\" (UniqueName: \"kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw\") pod \"a7ce633a-e963-458c-bbed-900cc02cedbb\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") "
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.431967 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities\") pod \"a7ce633a-e963-458c-bbed-900cc02cedbb\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") "
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.432030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content\") pod \"a7ce633a-e963-458c-bbed-900cc02cedbb\" (UID: \"a7ce633a-e963-458c-bbed-900cc02cedbb\") "
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.433629 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities" (OuterVolumeSpecName: "utilities") pod "a7ce633a-e963-458c-bbed-900cc02cedbb" (UID: "a7ce633a-e963-458c-bbed-900cc02cedbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.440108 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw" (OuterVolumeSpecName: "kube-api-access-55txw") pod "a7ce633a-e963-458c-bbed-900cc02cedbb" (UID: "a7ce633a-e963-458c-bbed-900cc02cedbb"). InnerVolumeSpecName "kube-api-access-55txw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.537016 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55txw\" (UniqueName: \"kubernetes.io/projected/a7ce633a-e963-458c-bbed-900cc02cedbb-kube-api-access-55txw\") on node \"crc\" DevicePath \"\""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.537053 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.550792 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ce633a-e963-458c-bbed-900cc02cedbb" (UID: "a7ce633a-e963-458c-bbed-900cc02cedbb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.639003 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ce633a-e963-458c-bbed-900cc02cedbb-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.828318 4776 generic.go:334] "Generic (PLEG): container finished" podID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerID="e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857" exitCode=0
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.828403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerDied","Data":"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"}
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.828457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xfv" event={"ID":"a7ce633a-e963-458c-bbed-900cc02cedbb","Type":"ContainerDied","Data":"de2cfe3b19f555d688ac02fb85738b0ab80590193babbddf9e301741c96705b3"}
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.828460 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xfv"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.828487 4776 scope.go:117] "RemoveContainer" containerID="e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.866881 4776 scope.go:117] "RemoveContainer" containerID="dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.886861 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"]
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.890685 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n6xfv"]
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.917915 4776 scope.go:117] "RemoveContainer" containerID="42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.958024 4776 scope.go:117] "RemoveContainer" containerID="e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"
Jan 28 07:16:28 crc kubenswrapper[4776]: E0128 07:16:28.958449 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857\": container with ID starting with e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857 not found: ID does not exist" containerID="e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.958506 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857"} err="failed to get container status \"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857\": rpc error: code = NotFound desc = could not find container \"e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857\": container with ID starting with e3f731bebcce646919f7b6aa491b2ef0f63b3eb2062764c780b2abb96ccd0857 not found: ID does not exist"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.958635 4776 scope.go:117] "RemoveContainer" containerID="dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"
Jan 28 07:16:28 crc kubenswrapper[4776]: E0128 07:16:28.959054 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5\": container with ID starting with dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5 not found: ID does not exist" containerID="dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.959101 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5"} err="failed to get container status \"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5\": rpc error: code = NotFound desc = could not find container \"dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5\": container with ID starting with dfcc0f8af2cfe1b38b6a5aaa9f1430254ca22894ffd7df814711e654c3967de5 not found: ID does not exist"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.959127 4776 scope.go:117] "RemoveContainer" containerID="42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f"
Jan 28 07:16:28 crc kubenswrapper[4776]: E0128 07:16:28.959408 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f\": container with ID starting with 42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f not found: ID does not exist" containerID="42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f"
Jan 28 07:16:28 crc kubenswrapper[4776]: I0128 07:16:28.959449 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f"} err="failed to get container status \"42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f\": rpc error: code = NotFound desc = could not find container \"42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f\": container with ID starting with 42df5ae454ff4da7abe0ba1d809fb49ad4e5e4c29cae155e013e14bea5664c4f not found: ID does not exist"
Jan 28 07:16:29 crc kubenswrapper[4776]: I0128 07:16:29.321762 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" path="/var/lib/kubelet/pods/a7ce633a-e963-458c-bbed-900cc02cedbb/volumes"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.413938 4776 scope.go:117] "RemoveContainer" containerID="9afeb46ec6221baa5d09d716d85530c311c743b5e7b9c4cd0c5b9366a3c36871"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.435106 4776 scope.go:117] "RemoveContainer" containerID="348a6e05a4815ebd00a8a79860c0672a83948db9951a31c1eb3b590317f19f7b"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.462463 4776 scope.go:117] "RemoveContainer" containerID="9c52a0d602820b67b5a7ee93eb456be265980e38ba4b5447d30e9560beaa0ca1"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.481502 4776 scope.go:117] "RemoveContainer" containerID="c4afce0457e6f296bd9294666fe4eb45f70f3f6fc5bb1d5483019aec01879d74"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.500676 4776 scope.go:117] "RemoveContainer" containerID="630011fce5539fafb71d6ab076243d692a62b66f8575f2a229b500205a3ab931"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.551450 4776 scope.go:117] "RemoveContainer" containerID="c720c04389fda0e82cedc90e2414fe31334a4c299f39a9a801fdeacad900f1d9"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.570500 4776 scope.go:117] "RemoveContainer" containerID="35b2655d70ebcd2a7ec9d07be917635f57b04a5d592a537d2ab2320c9efc0517"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.588079 4776 scope.go:117] "RemoveContainer" containerID="befe3a9224afd3d6a5d7170057eca386d4d9340503320d60f75d312188731c63"
Jan 28 07:16:46 crc kubenswrapper[4776]: I0128 07:16:46.604116 4776 scope.go:117] "RemoveContainer" containerID="049705c2e9b72b399b18a24b9a710269ecb8af45f04cd1b326c3db1711c29dde"
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.058296 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d25e-account-create-update-txwtx"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.075367 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9rmhk"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.090473 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-294lg"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.100195 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-87a7-account-create-update-f8927"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.108683 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fc6c-account-create-update-h8spp"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.116731 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fjb7k"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.124757 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fjb7k"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.133320 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-87a7-account-create-update-f8927"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.142020 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9rmhk"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.150076 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-294lg"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.159821 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d25e-account-create-update-txwtx"]
Jan 28 07:16:54 crc kubenswrapper[4776]: I0128 07:16:54.169760 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fc6c-account-create-update-h8spp"]
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.324741 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e1e400-5c65-433a-a52e-01160edeb76a" path="/var/lib/kubelet/pods/56e1e400-5c65-433a-a52e-01160edeb76a/volumes"
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.326138 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c623f99-9905-4a7e-addc-022e48fb40bf" path="/var/lib/kubelet/pods/5c623f99-9905-4a7e-addc-022e48fb40bf/volumes"
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.327582 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adc8ae9-dfbe-4c04-b844-b2fb424bda14" path="/var/lib/kubelet/pods/6adc8ae9-dfbe-4c04-b844-b2fb424bda14/volumes"
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.329094 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2eae23-aa55-4a27-a54c-b58d28da7b56" path="/var/lib/kubelet/pods/6d2eae23-aa55-4a27-a54c-b58d28da7b56/volumes"
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.331282 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8bcf822-d4ff-4cb2-b783-cea86d58ab8e" path="/var/lib/kubelet/pods/a8bcf822-d4ff-4cb2-b783-cea86d58ab8e/volumes"
Jan 28 07:16:55 crc kubenswrapper[4776]: I0128 07:16:55.332726 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92552a8-f85e-44d3-9b6f-d3614a6bc92a" path="/var/lib/kubelet/pods/c92552a8-f85e-44d3-9b6f-d3614a6bc92a/volumes"
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.033412 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-1f0e-account-create-update-rrjzl"]
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.046703 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-9c8kt"]
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.056582 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-9c8kt"]
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.069154 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-1f0e-account-create-update-rrjzl"]
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.316259 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a015c10-32ee-47e7-a5f4-21d5ccbe6e89" path="/var/lib/kubelet/pods/5a015c10-32ee-47e7-a5f4-21d5ccbe6e89/volumes"
Jan 28 07:16:57 crc kubenswrapper[4776]: I0128 07:16:57.316955 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4e7e46-a88a-44c2-8679-550be504407e" path="/var/lib/kubelet/pods/5c4e7e46-a88a-44c2-8679-550be504407e/volumes"
Jan 28 07:17:19 crc kubenswrapper[4776]: I0128 07:17:19.054114 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s2gfx"]
Jan 28 07:17:19 crc kubenswrapper[4776]: I0128 07:17:19.066460 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s2gfx"]
Jan 28 07:17:19 crc kubenswrapper[4776]: I0128 07:17:19.333703 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e5e104-629f-43f5-8372-dbe94e3938af" path="/var/lib/kubelet/pods/d8e5e104-629f-43f5-8372-dbe94e3938af/volumes"
Jan 28 07:17:27 crc kubenswrapper[4776]: I0128 07:17:27.043673 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nskdh"]
Jan 28 07:17:27 crc kubenswrapper[4776]: I0128 07:17:27.054438 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nskdh"]
Jan 28 07:17:27 crc kubenswrapper[4776]: I0128 07:17:27.314820 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff88b930-92ae-409c-9365-c9a0131558cb" path="/var/lib/kubelet/pods/ff88b930-92ae-409c-9365-c9a0131558cb/volumes"
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.050636 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1464-account-create-update-6k48w"]
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.069412 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6ljzs"]
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.079646 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1464-account-create-update-6k48w"]
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.090411 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6ljzs"]
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.317541 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c08552-4143-40ef-bb55-76c8d5146a7c" path="/var/lib/kubelet/pods/19c08552-4143-40ef-bb55-76c8d5146a7c/volumes"
Jan 28 07:17:31 crc kubenswrapper[4776]: I0128 07:17:31.318808 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de153a8c-d3be-4575-bc85-4d4b08bdf05c" path="/var/lib/kubelet/pods/de153a8c-d3be-4575-bc85-4d4b08bdf05c/volumes"
Jan 28 07:17:33 crc kubenswrapper[4776]: I0128 07:17:33.852668 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 07:17:33 crc kubenswrapper[4776]: I0128 07:17:33.852971 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.059641 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bfdc-account-create-update-bzmmp"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.080875 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d143-account-create-update-2s4kf"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.094406 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bmgbw"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.102352 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d143-account-create-update-2s4kf"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.109819 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bmgbw"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.117839 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bfdc-account-create-update-bzmmp"]
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.319010 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d514d3-c80d-400a-adaa-2b7adf96aab8" path="/var/lib/kubelet/pods/54d514d3-c80d-400a-adaa-2b7adf96aab8/volumes"
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.320756 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c47ae6-fdae-4e7c-b4d3-a8faf292b443" path="/var/lib/kubelet/pods/a2c47ae6-fdae-4e7c-b4d3-a8faf292b443/volumes"
Jan 28 07:17:35 crc kubenswrapper[4776]: I0128 07:17:35.322269 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d5e297-fedb-45dc-a861-423f2cbe5700" path="/var/lib/kubelet/pods/e0d5e297-fedb-45dc-a861-423f2cbe5700/volumes"
Jan 28 07:17:45 crc kubenswrapper[4776]: I0128 07:17:45.047283 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z6chw"]
Jan 28 07:17:45 crc kubenswrapper[4776]: I0128 07:17:45.067399 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z6chw"]
Jan 28 07:17:45 crc kubenswrapper[4776]: I0128 07:17:45.322717 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e67cfe-459a-4e26-99d6-302c7614acac" path="/var/lib/kubelet/pods/a9e67cfe-459a-4e26-99d6-302c7614acac/volumes"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.719531 4776 scope.go:117] "RemoveContainer" containerID="c93e9719d7d883ae856c1836e3b2090bdb5a3fdec5cfe4326e6af01888ea04dd"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.763817 4776 scope.go:117] "RemoveContainer" containerID="4f40bcd3111dd5a48ad78e3666f60f367d1016eb6651d7d637e676fb1c5677a8"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.818249 4776 scope.go:117] "RemoveContainer" containerID="79080a7b53a9f2a322741aa184fbd559b9a54427bb29bd067346407e0c890255"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.857664 4776 scope.go:117] "RemoveContainer" containerID="6a3eea09d446cfa1d0df82cc2e18518c23710a2669ad38f164b55ee1fbd660f7"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.913686 4776 scope.go:117] "RemoveContainer" containerID="34199f85b5df395c55405b41e3e0ee84b1e4189317638f251886597895a61300"
Jan 28 07:17:46 crc kubenswrapper[4776]: I0128 07:17:46.960023 4776 scope.go:117] "RemoveContainer" containerID="428a6bfbb182fcaa6f22bf059573f8b37e01a695e531100ac270f0987a79d309"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.010899 4776 scope.go:117] "RemoveContainer" containerID="7c311c8d46ebc562fde60e40bb783e27af644bbe89d573f240b1b41301091aef"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.031247 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lm5q5"]
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.040792 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lm5q5"]
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.044671 4776 scope.go:117] "RemoveContainer" containerID="abb8aa8983884cf9ae6e6f7d8cd64e47f7f5f5e7d22d3522ad1d1070717ca077"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.085818 4776 scope.go:117] "RemoveContainer" containerID="c69eb5fe3d965dc3069a35c58841cf4530b29ae95d561ab053761c41e24e4665"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.112368 4776 scope.go:117] "RemoveContainer" containerID="9407c576485ccdcd24787765b27e173c32a324fd9d1c97fec5310897541eb1f2"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.155785 4776 scope.go:117] "RemoveContainer" containerID="3e7f014b1fd104019485fa6d9d39c40d911d6966e2d7759bd0b4caed6a300aee"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.183481 4776 scope.go:117] "RemoveContainer" containerID="8fba4c3d1b4dfd12be137fcdca0b52c484e61f02dc4676663dfd49f415cabc4b"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.202071 4776 scope.go:117] "RemoveContainer" containerID="f1c6280e1c6756604e7d1fe0f44835aca83672edbe2e48eb3d530dd99a371d51"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.221638 4776 scope.go:117] "RemoveContainer" containerID="5d6c0273b6bbbd7bf666f8f1e1af4130d25302f353b0d9b75c92c540ea4b0c39"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.241216 4776 scope.go:117] "RemoveContainer" containerID="606ef16b590f9ea12fb51c089298405f243cb93b27b34961bf04e0cd21fe489b"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.259849 4776 scope.go:117] "RemoveContainer" containerID="0b48be84415e47cb5cd1cd00f597d80e72f15ae853414ecd14142a591dc802d0"
Jan 28 07:17:47 crc kubenswrapper[4776]: I0128 07:17:47.322944 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03934a8f-5036-4cf4-9dea-8f1bde73b2d7" path="/var/lib/kubelet/pods/03934a8f-5036-4cf4-9dea-8f1bde73b2d7/volumes"
Jan 28 07:18:03 crc kubenswrapper[4776]: I0128 07:18:03.852507 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 07:18:03 crc kubenswrapper[4776]: I0128 07:18:03.854705 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 07:18:06 crc kubenswrapper[4776]: I0128 07:18:06.967353 4776 generic.go:334] "Generic (PLEG): container finished" podID="ea52630b-ebcc-41d5-9265-eec1e8ae437d" containerID="f6abd2b6acb94cf279e6215537b7b485f529c4d7187eed46bd600dd72654cc57" exitCode=0
Jan 28 07:18:06 crc kubenswrapper[4776]: I0128 07:18:06.967476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" event={"ID":"ea52630b-ebcc-41d5-9265-eec1e8ae437d","Type":"ContainerDied","Data":"f6abd2b6acb94cf279e6215537b7b485f529c4d7187eed46bd600dd72654cc57"}
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.518118 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp"
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.688037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam\") pod \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") "
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.688390 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkrvx\" (UniqueName: \"kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx\") pod \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") "
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.688527 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory\") pod \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\" (UID: \"ea52630b-ebcc-41d5-9265-eec1e8ae437d\") "
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.694509 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx" (OuterVolumeSpecName: "kube-api-access-pkrvx") pod "ea52630b-ebcc-41d5-9265-eec1e8ae437d" (UID: "ea52630b-ebcc-41d5-9265-eec1e8ae437d"). InnerVolumeSpecName "kube-api-access-pkrvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.725019 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ea52630b-ebcc-41d5-9265-eec1e8ae437d" (UID: "ea52630b-ebcc-41d5-9265-eec1e8ae437d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.726493 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory" (OuterVolumeSpecName: "inventory") pod "ea52630b-ebcc-41d5-9265-eec1e8ae437d" (UID: "ea52630b-ebcc-41d5-9265-eec1e8ae437d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.791017 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.791052 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea52630b-ebcc-41d5-9265-eec1e8ae437d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 07:18:08 crc kubenswrapper[4776]: I0128 07:18:08.791067 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkrvx\" (UniqueName: \"kubernetes.io/projected/ea52630b-ebcc-41d5-9265-eec1e8ae437d-kube-api-access-pkrvx\") on node \"crc\" DevicePath \"\""
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.000525 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp" event={"ID":"ea52630b-ebcc-41d5-9265-eec1e8ae437d","Type":"ContainerDied","Data":"a1b1e5a058efb3dfd2d5cdd04cc531baab0277488ade1aad7f3cc558593c337b"}
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.000595 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b1e5a058efb3dfd2d5cdd04cc531baab0277488ade1aad7f3cc558593c337b"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.001053 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.091336 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"]
Jan 28 07:18:09 crc kubenswrapper[4776]: E0128 07:18:09.091841 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="extract-content"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.091883 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="extract-content"
Jan 28 07:18:09 crc kubenswrapper[4776]: E0128 07:18:09.091920 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="extract-utilities"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.091931 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="extract-utilities"
Jan 28 07:18:09 crc kubenswrapper[4776]: E0128 07:18:09.091949 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea52630b-ebcc-41d5-9265-eec1e8ae437d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.091958 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea52630b-ebcc-41d5-9265-eec1e8ae437d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:18:09 crc kubenswrapper[4776]: E0128 07:18:09.091976 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="registry-server"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.091984 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="registry-server"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.092198 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea52630b-ebcc-41d5-9265-eec1e8ae437d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.092216 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ce633a-e963-458c-bbed-900cc02cedbb" containerName="registry-server"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.093045 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.095757 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.096125 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.096296 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.096178 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.129641 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"]
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.200736 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.200813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.201068 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668l2\" (UniqueName: \"kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.302382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.302446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.302522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668l2\" (UniqueName: \"kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"
Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.307497 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\"
(UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.307496 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.319766 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668l2\" (UniqueName: \"kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w925h\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.427770 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:18:09 crc kubenswrapper[4776]: I0128 07:18:09.979775 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h"] Jan 28 07:18:10 crc kubenswrapper[4776]: I0128 07:18:10.010079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" event={"ID":"aafcd74c-ce06-4b5f-a858-ed32676f7503","Type":"ContainerStarted","Data":"68d2bafa444cd29dc1a0d89f9187671a92762bbe609693bee6329f4d983168da"} Jan 28 07:18:11 crc kubenswrapper[4776]: I0128 07:18:11.022816 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" event={"ID":"aafcd74c-ce06-4b5f-a858-ed32676f7503","Type":"ContainerStarted","Data":"2e07c1c265d91435b19a9e394b4a983902a316bd92cf720c49ec2da1b03b0bb5"} Jan 28 07:18:16 crc kubenswrapper[4776]: I0128 07:18:16.033507 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" podStartSLOduration=6.5593286840000005 podStartE2EDuration="7.033478004s" podCreationTimestamp="2026-01-28 07:18:09 +0000 UTC" firstStartedPulling="2026-01-28 07:18:09.981748365 +0000 UTC m=+1661.397408525" lastFinishedPulling="2026-01-28 07:18:10.455897665 +0000 UTC m=+1661.871557845" observedRunningTime="2026-01-28 07:18:11.045652729 +0000 UTC m=+1662.461312919" watchObservedRunningTime="2026-01-28 07:18:16.033478004 +0000 UTC m=+1667.449138184" Jan 28 07:18:16 crc kubenswrapper[4776]: I0128 07:18:16.040440 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q4gsk"] Jan 28 07:18:16 crc kubenswrapper[4776]: I0128 07:18:16.060920 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q4gsk"] Jan 28 07:18:17 crc 
kubenswrapper[4776]: I0128 07:18:17.323604 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2e30fd-49a7-4182-8e64-72c01a2394d4" path="/var/lib/kubelet/pods/dd2e30fd-49a7-4182-8e64-72c01a2394d4/volumes" Jan 28 07:18:28 crc kubenswrapper[4776]: I0128 07:18:28.057471 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5cgcf"] Jan 28 07:18:28 crc kubenswrapper[4776]: I0128 07:18:28.071108 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5cgcf"] Jan 28 07:18:29 crc kubenswrapper[4776]: I0128 07:18:29.332682 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1eea745-adc8-4e45-b52a-48190c7572b1" path="/var/lib/kubelet/pods/e1eea745-adc8-4e45-b52a-48190c7572b1/volumes" Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.036300 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-c8whn"] Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.054997 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-kqfn7"] Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.077062 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-c8whn"] Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.087881 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-kqfn7"] Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.320827 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6728fb0a-d0b8-4fd0-970e-7e5e496ecd03" path="/var/lib/kubelet/pods/6728fb0a-d0b8-4fd0-970e-7e5e496ecd03/volumes" Jan 28 07:18:31 crc kubenswrapper[4776]: I0128 07:18:31.321607 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f68261-4c2f-49dd-84b6-ee2dbd1dc36e" path="/var/lib/kubelet/pods/92f68261-4c2f-49dd-84b6-ee2dbd1dc36e/volumes" Jan 28 07:18:33 crc kubenswrapper[4776]: I0128 07:18:33.852217 4776 
patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:18:33 crc kubenswrapper[4776]: I0128 07:18:33.852605 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:18:33 crc kubenswrapper[4776]: I0128 07:18:33.852663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:18:33 crc kubenswrapper[4776]: I0128 07:18:33.853486 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:18:33 crc kubenswrapper[4776]: I0128 07:18:33.853560 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" gracePeriod=600 Jan 28 07:18:33 crc kubenswrapper[4776]: E0128 07:18:33.986253 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:18:34 crc kubenswrapper[4776]: I0128 07:18:34.255601 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" exitCode=0 Jan 28 07:18:34 crc kubenswrapper[4776]: I0128 07:18:34.255651 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"} Jan 28 07:18:34 crc kubenswrapper[4776]: I0128 07:18:34.255686 4776 scope.go:117] "RemoveContainer" containerID="8ff95d3106ec58750562936f2aa4128ad082c64114f36d54205fbce24b521f3d" Jan 28 07:18:34 crc kubenswrapper[4776]: I0128 07:18:34.256125 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:18:34 crc kubenswrapper[4776]: E0128 07:18:34.256402 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:18:37 crc kubenswrapper[4776]: I0128 07:18:37.038710 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fg2nt"] Jan 28 07:18:37 crc kubenswrapper[4776]: I0128 07:18:37.055024 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fg2nt"] Jan 28 07:18:37 crc 
kubenswrapper[4776]: I0128 07:18:37.318683 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6377f80e-0b32-479e-b33c-fc4d9f67b299" path="/var/lib/kubelet/pods/6377f80e-0b32-479e-b33c-fc4d9f67b299/volumes" Jan 28 07:18:44 crc kubenswrapper[4776]: I0128 07:18:44.038531 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zxp7l"] Jan 28 07:18:44 crc kubenswrapper[4776]: I0128 07:18:44.058802 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zxp7l"] Jan 28 07:18:45 crc kubenswrapper[4776]: I0128 07:18:45.331082 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1400af-1c32-4f74-89f8-30b42dbb6c91" path="/var/lib/kubelet/pods/2c1400af-1c32-4f74-89f8-30b42dbb6c91/volumes" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.304942 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:18:47 crc kubenswrapper[4776]: E0128 07:18:47.305666 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.522905 4776 scope.go:117] "RemoveContainer" containerID="5597d77ece7eef3736c8b6f7018e9ea1484bc323b436166a843fa2f82c7a7899" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.562168 4776 scope.go:117] "RemoveContainer" containerID="e067161ef4269cdad93881fc154e79b54eb2dd197ccd2d0bc4ef018370a84f91" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.628910 4776 scope.go:117] "RemoveContainer" 
containerID="a65362a05306196ca9edebfae4fe8396e76494779bb8a4cfe59d0edcaa6fc49e" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.661846 4776 scope.go:117] "RemoveContainer" containerID="d2b7bda6eb34b7f0aafe525e62ba6f60edf67f90dfa6a837e836ddbe372eee32" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.711936 4776 scope.go:117] "RemoveContainer" containerID="e3e40e8e3d347ea5c8c72bdefb22ddda79422d389b6094b60f9668c9128dfb32" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.776873 4776 scope.go:117] "RemoveContainer" containerID="1cb617be81d77b7ccfab7ea704ef6632cf07a245eaad455c79329a67813a41cc" Jan 28 07:18:47 crc kubenswrapper[4776]: I0128 07:18:47.813411 4776 scope.go:117] "RemoveContainer" containerID="5d4bb662b1d9ee7c5bff8bb2f4273626db7157fd8e1ebdf951d5be207f2c737e" Jan 28 07:19:00 crc kubenswrapper[4776]: I0128 07:19:00.304417 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:19:00 crc kubenswrapper[4776]: E0128 07:19:00.305433 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:19:12 crc kubenswrapper[4776]: I0128 07:19:12.305057 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:19:12 crc kubenswrapper[4776]: E0128 07:19:12.306283 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:19:20 crc kubenswrapper[4776]: I0128 07:19:20.068797 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-36c2-account-create-update-mdsmj"] Jan 28 07:19:20 crc kubenswrapper[4776]: I0128 07:19:20.085050 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-36c2-account-create-update-mdsmj"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.077596 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8caa-account-create-update-jcswg"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.093460 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4knk8"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.112070 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rh4ps"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.121745 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c3a3-account-create-update-xq2r2"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.130204 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-79sh9"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.137772 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8caa-account-create-update-jcswg"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.145005 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4knk8"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.152437 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rh4ps"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.159660 4776 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c3a3-account-create-update-xq2r2"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.167270 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-79sh9"] Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.318124 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14baefd8-d74d-44bd-83a9-64f6d8a71fbe" path="/var/lib/kubelet/pods/14baefd8-d74d-44bd-83a9-64f6d8a71fbe/volumes" Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.320101 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435a9994-8725-46d9-ad26-4cf3179a61e9" path="/var/lib/kubelet/pods/435a9994-8725-46d9-ad26-4cf3179a61e9/volumes" Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.321468 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7649c00f-e457-48f3-8d5c-28ca197fb663" path="/var/lib/kubelet/pods/7649c00f-e457-48f3-8d5c-28ca197fb663/volumes" Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.322805 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8aa4600-018d-4394-a873-92af1c70b5ba" path="/var/lib/kubelet/pods/c8aa4600-018d-4394-a873-92af1c70b5ba/volumes" Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.324446 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17a5a2d-48f9-4ebb-a103-1c9e92d82f41" path="/var/lib/kubelet/pods/d17a5a2d-48f9-4ebb-a103-1c9e92d82f41/volumes" Jan 28 07:19:21 crc kubenswrapper[4776]: I0128 07:19:21.325009 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46" path="/var/lib/kubelet/pods/dfd15c75-c3c3-40a4-9c90-1caeeb7d6b46/volumes" Jan 28 07:19:25 crc kubenswrapper[4776]: I0128 07:19:25.305749 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:19:25 crc kubenswrapper[4776]: E0128 
07:19:25.307787 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:19:27 crc kubenswrapper[4776]: I0128 07:19:27.808467 4776 generic.go:334] "Generic (PLEG): container finished" podID="aafcd74c-ce06-4b5f-a858-ed32676f7503" containerID="2e07c1c265d91435b19a9e394b4a983902a316bd92cf720c49ec2da1b03b0bb5" exitCode=0 Jan 28 07:19:27 crc kubenswrapper[4776]: I0128 07:19:27.809117 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" event={"ID":"aafcd74c-ce06-4b5f-a858-ed32676f7503","Type":"ContainerDied","Data":"2e07c1c265d91435b19a9e394b4a983902a316bd92cf720c49ec2da1b03b0bb5"} Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.318798 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.390278 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory\") pod \"aafcd74c-ce06-4b5f-a858-ed32676f7503\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.390674 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-668l2\" (UniqueName: \"kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2\") pod \"aafcd74c-ce06-4b5f-a858-ed32676f7503\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.390892 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam\") pod \"aafcd74c-ce06-4b5f-a858-ed32676f7503\" (UID: \"aafcd74c-ce06-4b5f-a858-ed32676f7503\") " Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.396435 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2" (OuterVolumeSpecName: "kube-api-access-668l2") pod "aafcd74c-ce06-4b5f-a858-ed32676f7503" (UID: "aafcd74c-ce06-4b5f-a858-ed32676f7503"). InnerVolumeSpecName "kube-api-access-668l2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.417357 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aafcd74c-ce06-4b5f-a858-ed32676f7503" (UID: "aafcd74c-ce06-4b5f-a858-ed32676f7503"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.417807 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory" (OuterVolumeSpecName: "inventory") pod "aafcd74c-ce06-4b5f-a858-ed32676f7503" (UID: "aafcd74c-ce06-4b5f-a858-ed32676f7503"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.493066 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.493094 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-668l2\" (UniqueName: \"kubernetes.io/projected/aafcd74c-ce06-4b5f-a858-ed32676f7503-kube-api-access-668l2\") on node \"crc\" DevicePath \"\"" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.493104 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aafcd74c-ce06-4b5f-a858-ed32676f7503-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.838153 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" 
event={"ID":"aafcd74c-ce06-4b5f-a858-ed32676f7503","Type":"ContainerDied","Data":"68d2bafa444cd29dc1a0d89f9187671a92762bbe609693bee6329f4d983168da"} Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.838204 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d2bafa444cd29dc1a0d89f9187671a92762bbe609693bee6329f4d983168da" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.838246 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w925h" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.965530 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"] Jan 28 07:19:29 crc kubenswrapper[4776]: E0128 07:19:29.966288 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafcd74c-ce06-4b5f-a858-ed32676f7503" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.966322 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafcd74c-ce06-4b5f-a858-ed32676f7503" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.966731 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="aafcd74c-ce06-4b5f-a858-ed32676f7503" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.967863 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.970380 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.971458 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.971663 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.972228 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:19:29 crc kubenswrapper[4776]: I0128 07:19:29.989629 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"] Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.107670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88f7f\" (UniqueName: \"kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.107840 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" Jan 28 
07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.107953 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.210971 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.211317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88f7f\" (UniqueName: \"kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.211449 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.220588 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.230084 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.239342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88f7f\" (UniqueName: \"kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.293244 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.703860 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"]
Jan 28 07:19:30 crc kubenswrapper[4776]: I0128 07:19:30.851700 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" event={"ID":"cb26e5c4-e4de-4bca-86c0-160dffb2bb73","Type":"ContainerStarted","Data":"99be99ce5d65774a031f64ec0a7c69deb80007aafa27315ed62d1465d4af9b8a"}
Jan 28 07:19:31 crc kubenswrapper[4776]: I0128 07:19:31.865838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" event={"ID":"cb26e5c4-e4de-4bca-86c0-160dffb2bb73","Type":"ContainerStarted","Data":"b6469b9e2356ca835023726107306c8b122691662b9c14280a563913cb28ac4f"}
Jan 28 07:19:31 crc kubenswrapper[4776]: I0128 07:19:31.895742 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" podStartSLOduration=2.387639697 podStartE2EDuration="2.895716369s" podCreationTimestamp="2026-01-28 07:19:29 +0000 UTC" firstStartedPulling="2026-01-28 07:19:30.699185562 +0000 UTC m=+1742.114845732" lastFinishedPulling="2026-01-28 07:19:31.207262204 +0000 UTC m=+1742.622922404" observedRunningTime="2026-01-28 07:19:31.883778037 +0000 UTC m=+1743.299438237" watchObservedRunningTime="2026-01-28 07:19:31.895716369 +0000 UTC m=+1743.311376569"
Jan 28 07:19:36 crc kubenswrapper[4776]: I0128 07:19:36.304834 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"
Jan 28 07:19:36 crc kubenswrapper[4776]: E0128 07:19:36.305790 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:19:36 crc kubenswrapper[4776]: I0128 07:19:36.923841 4776 generic.go:334] "Generic (PLEG): container finished" podID="cb26e5c4-e4de-4bca-86c0-160dffb2bb73" containerID="b6469b9e2356ca835023726107306c8b122691662b9c14280a563913cb28ac4f" exitCode=0
Jan 28 07:19:36 crc kubenswrapper[4776]: I0128 07:19:36.923906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" event={"ID":"cb26e5c4-e4de-4bca-86c0-160dffb2bb73","Type":"ContainerDied","Data":"b6469b9e2356ca835023726107306c8b122691662b9c14280a563913cb28ac4f"}
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.432482 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.619227 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory\") pod \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") "
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.619286 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88f7f\" (UniqueName: \"kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f\") pod \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") "
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.619468 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam\") pod \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\" (UID: \"cb26e5c4-e4de-4bca-86c0-160dffb2bb73\") "
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.627525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f" (OuterVolumeSpecName: "kube-api-access-88f7f") pod "cb26e5c4-e4de-4bca-86c0-160dffb2bb73" (UID: "cb26e5c4-e4de-4bca-86c0-160dffb2bb73"). InnerVolumeSpecName "kube-api-access-88f7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.658167 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb26e5c4-e4de-4bca-86c0-160dffb2bb73" (UID: "cb26e5c4-e4de-4bca-86c0-160dffb2bb73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.660871 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory" (OuterVolumeSpecName: "inventory") pod "cb26e5c4-e4de-4bca-86c0-160dffb2bb73" (UID: "cb26e5c4-e4de-4bca-86c0-160dffb2bb73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.722487 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.722535 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.722582 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88f7f\" (UniqueName: \"kubernetes.io/projected/cb26e5c4-e4de-4bca-86c0-160dffb2bb73-kube-api-access-88f7f\") on node \"crc\" DevicePath \"\""
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.949459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b" event={"ID":"cb26e5c4-e4de-4bca-86c0-160dffb2bb73","Type":"ContainerDied","Data":"99be99ce5d65774a031f64ec0a7c69deb80007aafa27315ed62d1465d4af9b8a"}
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.949500 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99be99ce5d65774a031f64ec0a7c69deb80007aafa27315ed62d1465d4af9b8a"
Jan 28 07:19:38 crc kubenswrapper[4776]: I0128 07:19:38.949602 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.052265 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"]
Jan 28 07:19:39 crc kubenswrapper[4776]: E0128 07:19:39.052716 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb26e5c4-e4de-4bca-86c0-160dffb2bb73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.052735 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb26e5c4-e4de-4bca-86c0-160dffb2bb73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.052915 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb26e5c4-e4de-4bca-86c0-160dffb2bb73" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.053513 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.063978 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.064644 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.064728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.064753 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.074358 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"]
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.230445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fq4r\" (UniqueName: \"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.230658 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.230709 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.332044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.332675 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.332942 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fq4r\" (UniqueName: \"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.339821 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.340153 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.366477 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fq4r\" (UniqueName: \"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-688gl\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:39 crc kubenswrapper[4776]: I0128 07:19:39.382242 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"
Jan 28 07:19:40 crc kubenswrapper[4776]: I0128 07:19:40.012957 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl"]
Jan 28 07:19:40 crc kubenswrapper[4776]: I0128 07:19:40.971206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" event={"ID":"a00fc8a0-f777-496b-80d1-2c6e116d6e00","Type":"ContainerStarted","Data":"1f9b08a0388d8f7f15ddbfc986c68600d4dbee799003a3e4884c72148cf5ebc6"}
Jan 28 07:19:40 crc kubenswrapper[4776]: I0128 07:19:40.971527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" event={"ID":"a00fc8a0-f777-496b-80d1-2c6e116d6e00","Type":"ContainerStarted","Data":"6cedf4b0cc5e5b2e37ca27c999a5840cc091e34d7ed692913fb0d5f42bec5f7f"}
Jan 28 07:19:41 crc kubenswrapper[4776]: I0128 07:19:41.004460 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" podStartSLOduration=1.606172978 podStartE2EDuration="2.004441407s" podCreationTimestamp="2026-01-28 07:19:39 +0000 UTC" firstStartedPulling="2026-01-28 07:19:40.023228943 +0000 UTC m=+1751.438889103" lastFinishedPulling="2026-01-28 07:19:40.421497352 +0000 UTC m=+1751.837157532" observedRunningTime="2026-01-28 07:19:40.997994133 +0000 UTC m=+1752.413654303" watchObservedRunningTime="2026-01-28 07:19:41.004441407 +0000 UTC m=+1752.420101577"
Jan 28 07:19:47 crc kubenswrapper[4776]: I0128 07:19:47.981736 4776 scope.go:117] "RemoveContainer" containerID="563abec8d38830b597d10b9c43b52d90dcbaa959229193230499831920913b27"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.011004 4776 scope.go:117] "RemoveContainer" containerID="76f3490238c1debb347ca950de1a442bd8b51a2d3664cc77098990367a729c10"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.070795 4776 scope.go:117] "RemoveContainer" containerID="a5021333bc8a17baae09ea0b334e2701df8bbf478ef05e433929c14053ec8de5"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.122148 4776 scope.go:117] "RemoveContainer" containerID="a76f39a347c923f93cb7e5af13b816e4676bef72ee179caf1cbdbedbb7eb8b24"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.167766 4776 scope.go:117] "RemoveContainer" containerID="d8f2e34a8dac1da7497496239cdfac06c222fef456fa7664b5899176a9259827"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.207758 4776 scope.go:117] "RemoveContainer" containerID="dedf562340912036619cca369ad682470e50dd8d4ea3311e00b8506d2395f06a"
Jan 28 07:19:48 crc kubenswrapper[4776]: I0128 07:19:48.305101 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"
Jan 28 07:19:48 crc kubenswrapper[4776]: E0128 07:19:48.305425 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:19:49 crc kubenswrapper[4776]: I0128 07:19:49.038428 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vpps7"]
Jan 28 07:19:49 crc kubenswrapper[4776]: I0128 07:19:49.047919 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vpps7"]
Jan 28 07:19:49 crc kubenswrapper[4776]: I0128 07:19:49.318261 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745dbbf2-ccdc-4a8f-878b-9208eb693cc5" path="/var/lib/kubelet/pods/745dbbf2-ccdc-4a8f-878b-9208eb693cc5/volumes"
Jan 28 07:19:59 crc kubenswrapper[4776]: I0128 07:19:59.311113 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"
Jan 28 07:19:59 crc kubenswrapper[4776]: E0128 07:19:59.312169 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.582084 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.584915 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.603610 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.639762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg6jb\" (UniqueName: \"kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.640527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.640807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.743626 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg6jb\" (UniqueName: \"kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.743769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.744021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.744514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.744746 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.763803 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg6jb\" (UniqueName: \"kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb\") pod \"redhat-marketplace-p9dch\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") " pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:04 crc kubenswrapper[4776]: I0128 07:20:04.931597 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:05 crc kubenswrapper[4776]: I0128 07:20:05.367188 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:06 crc kubenswrapper[4776]: I0128 07:20:06.225863 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerID="75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7" exitCode=0
Jan 28 07:20:06 crc kubenswrapper[4776]: I0128 07:20:06.225975 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerDied","Data":"75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7"}
Jan 28 07:20:06 crc kubenswrapper[4776]: I0128 07:20:06.226221 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerStarted","Data":"f4cd191dde72e3d0f6e807f1e276cf33fafacf11262febd4f4b6fbfa3c1b5717"}
Jan 28 07:20:07 crc kubenswrapper[4776]: I0128 07:20:07.239727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerStarted","Data":"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"}
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.056731 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ww8wt"]
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.066457 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7qvb2"]
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.077048 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7qvb2"]
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.085640 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ww8wt"]
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.252804 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerID="79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30" exitCode=0
Jan 28 07:20:08 crc kubenswrapper[4776]: I0128 07:20:08.252862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerDied","Data":"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"}
Jan 28 07:20:09 crc kubenswrapper[4776]: I0128 07:20:09.263743 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerStarted","Data":"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"}
Jan 28 07:20:09 crc kubenswrapper[4776]: I0128 07:20:09.293401 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p9dch" podStartSLOduration=2.672080593 podStartE2EDuration="5.293380872s" podCreationTimestamp="2026-01-28 07:20:04 +0000 UTC" firstStartedPulling="2026-01-28 07:20:06.227500135 +0000 UTC m=+1777.643160295" lastFinishedPulling="2026-01-28 07:20:08.848800414 +0000 UTC m=+1780.264460574" observedRunningTime="2026-01-28 07:20:09.284854581 +0000 UTC m=+1780.700514741" watchObservedRunningTime="2026-01-28 07:20:09.293380872 +0000 UTC m=+1780.709041032"
Jan 28 07:20:09 crc kubenswrapper[4776]: I0128 07:20:09.322711 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b0d9dc-05c6-417b-adaa-85d82bf95aeb" path="/var/lib/kubelet/pods/21b0d9dc-05c6-417b-adaa-85d82bf95aeb/volumes"
Jan 28 07:20:09 crc kubenswrapper[4776]: I0128 07:20:09.323709 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf66e818-91d4-4a50-b10e-b40f0dc8754d" path="/var/lib/kubelet/pods/bf66e818-91d4-4a50-b10e-b40f0dc8754d/volumes"
Jan 28 07:20:12 crc kubenswrapper[4776]: I0128 07:20:12.305601 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b"
Jan 28 07:20:12 crc kubenswrapper[4776]: E0128 07:20:12.306626 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:20:14 crc kubenswrapper[4776]: I0128 07:20:14.931982 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:14 crc kubenswrapper[4776]: I0128 07:20:14.932625 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:14 crc kubenswrapper[4776]: I0128 07:20:14.981308 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:15 crc kubenswrapper[4776]: I0128 07:20:15.389108 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:15 crc kubenswrapper[4776]: I0128 07:20:15.436483 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.359823 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p9dch" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="registry-server" containerID="cri-o://2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c" gracePeriod=2
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.788677 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.823532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg6jb\" (UniqueName: \"kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb\") pod \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") "
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.823744 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities\") pod \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") "
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.823922 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content\") pod \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\" (UID: \"a9aafc6f-0de4-464e-a165-444a7bdf68c2\") "
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.824810 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities" (OuterVolumeSpecName: "utilities") pod "a9aafc6f-0de4-464e-a165-444a7bdf68c2" (UID: "a9aafc6f-0de4-464e-a165-444a7bdf68c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.830861 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb" (OuterVolumeSpecName: "kube-api-access-qg6jb") pod "a9aafc6f-0de4-464e-a165-444a7bdf68c2" (UID: "a9aafc6f-0de4-464e-a165-444a7bdf68c2"). InnerVolumeSpecName "kube-api-access-qg6jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.926209 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg6jb\" (UniqueName: \"kubernetes.io/projected/a9aafc6f-0de4-464e-a165-444a7bdf68c2-kube-api-access-qg6jb\") on node \"crc\" DevicePath \"\""
Jan 28 07:20:17 crc kubenswrapper[4776]: I0128 07:20:17.926246 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.019233 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9aafc6f-0de4-464e-a165-444a7bdf68c2" (UID: "a9aafc6f-0de4-464e-a165-444a7bdf68c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.030032 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9aafc6f-0de4-464e-a165-444a7bdf68c2-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.371650 4776 generic.go:334] "Generic (PLEG): container finished" podID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerID="2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c" exitCode=0
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.371691 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerDied","Data":"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"}
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.371717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9dch" event={"ID":"a9aafc6f-0de4-464e-a165-444a7bdf68c2","Type":"ContainerDied","Data":"f4cd191dde72e3d0f6e807f1e276cf33fafacf11262febd4f4b6fbfa3c1b5717"}
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.371733 4776 scope.go:117] "RemoveContainer" containerID="2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.371743 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9dch"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.394092 4776 scope.go:117] "RemoveContainer" containerID="79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.416144 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.418222 4776 scope.go:117] "RemoveContainer" containerID="75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.426376 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9dch"]
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.466772 4776 scope.go:117] "RemoveContainer" containerID="2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"
Jan 28 07:20:18 crc kubenswrapper[4776]: E0128 07:20:18.467137 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c\": container with ID starting with 2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c not found: ID does not exist" containerID="2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.467167 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c"} err="failed to get container status \"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c\": rpc error: code = NotFound desc = could not find container \"2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c\": container with ID starting with 2a0714fd40c098b0bccc619800744130f9f5908b1f5d69413467f50473de197c not found: ID does not exist"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.467186 4776 scope.go:117] "RemoveContainer" containerID="79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"
Jan 28 07:20:18 crc kubenswrapper[4776]: E0128 07:20:18.467392 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30\": container with ID starting with 79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30 not found: ID does not exist" containerID="79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.467410 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30"} err="failed to get container status \"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30\": rpc error: code = NotFound desc = could not find container \"79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30\": container with ID starting with 79c98b863805fd3ce04bc2e7e2002e509c32418540515481e39691799aa3fb30 not found: ID does not exist"
Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.467421 4776 scope.go:117] "RemoveContainer" containerID="75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7"
Jan 28 07:20:18 crc kubenswrapper[4776]: E0128
07:20:18.467618 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7\": container with ID starting with 75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7 not found: ID does not exist" containerID="75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7" Jan 28 07:20:18 crc kubenswrapper[4776]: I0128 07:20:18.467637 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7"} err="failed to get container status \"75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7\": rpc error: code = NotFound desc = could not find container \"75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7\": container with ID starting with 75aa278523c85c595ccc86ea3acfc3b516c275aebe7e216ffe1cc1f8452d38f7 not found: ID does not exist" Jan 28 07:20:19 crc kubenswrapper[4776]: I0128 07:20:19.314881 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" path="/var/lib/kubelet/pods/a9aafc6f-0de4-464e-a165-444a7bdf68c2/volumes" Jan 28 07:20:25 crc kubenswrapper[4776]: I0128 07:20:25.307084 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:20:25 crc kubenswrapper[4776]: E0128 07:20:25.308526 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:20:26 crc kubenswrapper[4776]: I0128 07:20:26.456594 
4776 generic.go:334] "Generic (PLEG): container finished" podID="a00fc8a0-f777-496b-80d1-2c6e116d6e00" containerID="1f9b08a0388d8f7f15ddbfc986c68600d4dbee799003a3e4884c72148cf5ebc6" exitCode=0 Jan 28 07:20:26 crc kubenswrapper[4776]: I0128 07:20:26.456650 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" event={"ID":"a00fc8a0-f777-496b-80d1-2c6e116d6e00","Type":"ContainerDied","Data":"1f9b08a0388d8f7f15ddbfc986c68600d4dbee799003a3e4884c72148cf5ebc6"} Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:27.999709 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.081076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fq4r\" (UniqueName: \"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r\") pod \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.081409 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory\") pod \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.081511 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam\") pod \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\" (UID: \"a00fc8a0-f777-496b-80d1-2c6e116d6e00\") " Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.133894 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r" (OuterVolumeSpecName: "kube-api-access-9fq4r") pod "a00fc8a0-f777-496b-80d1-2c6e116d6e00" (UID: "a00fc8a0-f777-496b-80d1-2c6e116d6e00"). InnerVolumeSpecName "kube-api-access-9fq4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.169583 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory" (OuterVolumeSpecName: "inventory") pod "a00fc8a0-f777-496b-80d1-2c6e116d6e00" (UID: "a00fc8a0-f777-496b-80d1-2c6e116d6e00"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.175760 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a00fc8a0-f777-496b-80d1-2c6e116d6e00" (UID: "a00fc8a0-f777-496b-80d1-2c6e116d6e00"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.184718 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fq4r\" (UniqueName: \"kubernetes.io/projected/a00fc8a0-f777-496b-80d1-2c6e116d6e00-kube-api-access-9fq4r\") on node \"crc\" DevicePath \"\"" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.184745 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.184760 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a00fc8a0-f777-496b-80d1-2c6e116d6e00-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.481995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" event={"ID":"a00fc8a0-f777-496b-80d1-2c6e116d6e00","Type":"ContainerDied","Data":"6cedf4b0cc5e5b2e37ca27c999a5840cc091e34d7ed692913fb0d5f42bec5f7f"} Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.482039 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cedf4b0cc5e5b2e37ca27c999a5840cc091e34d7ed692913fb0d5f42bec5f7f" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.482083 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-688gl" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.587926 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt"] Jan 28 07:20:28 crc kubenswrapper[4776]: E0128 07:20:28.589060 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00fc8a0-f777-496b-80d1-2c6e116d6e00" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.589086 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00fc8a0-f777-496b-80d1-2c6e116d6e00" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:20:28 crc kubenswrapper[4776]: E0128 07:20:28.589098 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="extract-content" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.589106 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="extract-content" Jan 28 07:20:28 crc kubenswrapper[4776]: E0128 07:20:28.589158 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="extract-utilities" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.589167 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="extract-utilities" Jan 28 07:20:28 crc kubenswrapper[4776]: E0128 07:20:28.589191 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="registry-server" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.589199 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="registry-server" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.590456 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00fc8a0-f777-496b-80d1-2c6e116d6e00" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.590489 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9aafc6f-0de4-464e-a165-444a7bdf68c2" containerName="registry-server" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.602334 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.607319 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.607603 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.607767 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.608147 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.628966 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt"] Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.699330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqz4k\" (UniqueName: \"kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc 
kubenswrapper[4776]: I0128 07:20:28.699464 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.699501 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.801797 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqz4k\" (UniqueName: \"kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.802024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.802087 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.809741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.809741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.821657 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqz4k\" (UniqueName: \"kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:28 crc kubenswrapper[4776]: I0128 07:20:28.935296 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:20:29 crc kubenswrapper[4776]: I0128 07:20:29.533711 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt"] Jan 28 07:20:29 crc kubenswrapper[4776]: I0128 07:20:29.952029 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:20:30 crc kubenswrapper[4776]: I0128 07:20:30.501913 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" event={"ID":"e525b968-aa0e-4d5a-9fe4-063ce4fdb686","Type":"ContainerStarted","Data":"5aa4cd98fe5863d7b201772c162cf2b18a3b3cca59f03905ec0b552d1ced5843"} Jan 28 07:20:30 crc kubenswrapper[4776]: I0128 07:20:30.502230 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" event={"ID":"e525b968-aa0e-4d5a-9fe4-063ce4fdb686","Type":"ContainerStarted","Data":"cecbe0f12da0b8f9b48b8ec6bc2543f95f64d85b9685792f7f612383e25f3248"} Jan 28 07:20:30 crc kubenswrapper[4776]: I0128 07:20:30.526947 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" podStartSLOduration=2.132599477 podStartE2EDuration="2.526925881s" podCreationTimestamp="2026-01-28 07:20:28 +0000 UTC" firstStartedPulling="2026-01-28 07:20:29.554356169 +0000 UTC m=+1800.970016319" lastFinishedPulling="2026-01-28 07:20:29.948682563 +0000 UTC m=+1801.364342723" observedRunningTime="2026-01-28 07:20:30.516784248 +0000 UTC m=+1801.932444418" watchObservedRunningTime="2026-01-28 07:20:30.526925881 +0000 UTC m=+1801.942586051" Jan 28 07:20:39 crc kubenswrapper[4776]: I0128 07:20:39.304854 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:20:39 crc 
kubenswrapper[4776]: E0128 07:20:39.305658 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:20:48 crc kubenswrapper[4776]: I0128 07:20:48.369261 4776 scope.go:117] "RemoveContainer" containerID="dd68b836b756665baf0195ac99aaa3369ce42363de65b984002275c975ab30e1" Jan 28 07:20:48 crc kubenswrapper[4776]: I0128 07:20:48.430312 4776 scope.go:117] "RemoveContainer" containerID="72dd7aae6ab26716dba4392a2c43894b15d54a4848029158565f5e0c93aa7081" Jan 28 07:20:48 crc kubenswrapper[4776]: I0128 07:20:48.487055 4776 scope.go:117] "RemoveContainer" containerID="22bc85da721c95a33e45c535ae41c5000cec07383fd28065763096bad879272d" Jan 28 07:20:50 crc kubenswrapper[4776]: I0128 07:20:50.305081 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:20:50 crc kubenswrapper[4776]: E0128 07:20:50.305488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:20:53 crc kubenswrapper[4776]: I0128 07:20:53.072331 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2477"] Jan 28 07:20:53 crc kubenswrapper[4776]: I0128 07:20:53.088678 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-t2477"] Jan 28 07:20:53 crc kubenswrapper[4776]: I0128 07:20:53.320354 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9c2ccb-51d7-4307-8536-11657989c02d" path="/var/lib/kubelet/pods/2c9c2ccb-51d7-4307-8536-11657989c02d/volumes" Jan 28 07:21:03 crc kubenswrapper[4776]: I0128 07:21:03.306627 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:21:03 crc kubenswrapper[4776]: E0128 07:21:03.308470 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:21:14 crc kubenswrapper[4776]: I0128 07:21:14.305665 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:21:14 crc kubenswrapper[4776]: E0128 07:21:14.306489 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:21:27 crc kubenswrapper[4776]: I0128 07:21:27.304968 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:21:27 crc kubenswrapper[4776]: E0128 07:21:27.305636 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:21:30 crc kubenswrapper[4776]: I0128 07:21:30.233736 4776 generic.go:334] "Generic (PLEG): container finished" podID="e525b968-aa0e-4d5a-9fe4-063ce4fdb686" containerID="5aa4cd98fe5863d7b201772c162cf2b18a3b3cca59f03905ec0b552d1ced5843" exitCode=0 Jan 28 07:21:30 crc kubenswrapper[4776]: I0128 07:21:30.233863 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" event={"ID":"e525b968-aa0e-4d5a-9fe4-063ce4fdb686","Type":"ContainerDied","Data":"5aa4cd98fe5863d7b201772c162cf2b18a3b3cca59f03905ec0b552d1ced5843"} Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.688260 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.753162 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory\") pod \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.753820 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam\") pod \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.754024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqz4k\" (UniqueName: \"kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k\") pod \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\" (UID: \"e525b968-aa0e-4d5a-9fe4-063ce4fdb686\") " Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.762172 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k" (OuterVolumeSpecName: "kube-api-access-cqz4k") pod "e525b968-aa0e-4d5a-9fe4-063ce4fdb686" (UID: "e525b968-aa0e-4d5a-9fe4-063ce4fdb686"). InnerVolumeSpecName "kube-api-access-cqz4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.784839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e525b968-aa0e-4d5a-9fe4-063ce4fdb686" (UID: "e525b968-aa0e-4d5a-9fe4-063ce4fdb686"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.786831 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory" (OuterVolumeSpecName: "inventory") pod "e525b968-aa0e-4d5a-9fe4-063ce4fdb686" (UID: "e525b968-aa0e-4d5a-9fe4-063ce4fdb686"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.856793 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.856836 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqz4k\" (UniqueName: \"kubernetes.io/projected/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-kube-api-access-cqz4k\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:31 crc kubenswrapper[4776]: I0128 07:21:31.856851 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e525b968-aa0e-4d5a-9fe4-063ce4fdb686-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.263464 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" 
event={"ID":"e525b968-aa0e-4d5a-9fe4-063ce4fdb686","Type":"ContainerDied","Data":"cecbe0f12da0b8f9b48b8ec6bc2543f95f64d85b9685792f7f612383e25f3248"} Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.263527 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cecbe0f12da0b8f9b48b8ec6bc2543f95f64d85b9685792f7f612383e25f3248" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.263625 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.369333 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vb66j"] Jan 28 07:21:32 crc kubenswrapper[4776]: E0128 07:21:32.370265 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e525b968-aa0e-4d5a-9fe4-063ce4fdb686" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.370285 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525b968-aa0e-4d5a-9fe4-063ce4fdb686" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.370459 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e525b968-aa0e-4d5a-9fe4-063ce4fdb686" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.371315 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.373468 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.374193 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.374407 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.374483 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.408766 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vb66j"] Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.473795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmcnb\" (UniqueName: \"kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.473875 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.473925 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.575371 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.575450 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.576594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmcnb\" (UniqueName: \"kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.581630 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 
28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.588414 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.600027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmcnb\" (UniqueName: \"kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb\") pod \"ssh-known-hosts-edpm-deployment-vb66j\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:32 crc kubenswrapper[4776]: I0128 07:21:32.691917 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:33 crc kubenswrapper[4776]: I0128 07:21:33.213907 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:21:33 crc kubenswrapper[4776]: I0128 07:21:33.219834 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vb66j"] Jan 28 07:21:33 crc kubenswrapper[4776]: I0128 07:21:33.274109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" event={"ID":"f2d64cd0-fb47-4169-88d3-dec3bb7591b0","Type":"ContainerStarted","Data":"c1ec9f99d6492cc8a5fc39ec09d24759dd77a640440a99d0ad0627f8522fa30d"} Jan 28 07:21:34 crc kubenswrapper[4776]: I0128 07:21:34.284634 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" event={"ID":"f2d64cd0-fb47-4169-88d3-dec3bb7591b0","Type":"ContainerStarted","Data":"aceeb615b9f3d6b0333b7f9f048be4051dd3b280fb5ebe30070e0c2030c38666"} Jan 28 07:21:34 crc 
kubenswrapper[4776]: I0128 07:21:34.316739 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" podStartSLOduration=1.842123234 podStartE2EDuration="2.316717045s" podCreationTimestamp="2026-01-28 07:21:32 +0000 UTC" firstStartedPulling="2026-01-28 07:21:33.21372735 +0000 UTC m=+1864.629387510" lastFinishedPulling="2026-01-28 07:21:33.688321151 +0000 UTC m=+1865.103981321" observedRunningTime="2026-01-28 07:21:34.304250832 +0000 UTC m=+1865.719911002" watchObservedRunningTime="2026-01-28 07:21:34.316717045 +0000 UTC m=+1865.732377215" Jan 28 07:21:41 crc kubenswrapper[4776]: I0128 07:21:41.306019 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:21:41 crc kubenswrapper[4776]: E0128 07:21:41.307226 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:21:42 crc kubenswrapper[4776]: I0128 07:21:42.390630 4776 generic.go:334] "Generic (PLEG): container finished" podID="f2d64cd0-fb47-4169-88d3-dec3bb7591b0" containerID="aceeb615b9f3d6b0333b7f9f048be4051dd3b280fb5ebe30070e0c2030c38666" exitCode=0 Jan 28 07:21:42 crc kubenswrapper[4776]: I0128 07:21:42.390705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" event={"ID":"f2d64cd0-fb47-4169-88d3-dec3bb7591b0","Type":"ContainerDied","Data":"aceeb615b9f3d6b0333b7f9f048be4051dd3b280fb5ebe30070e0c2030c38666"} Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.850669 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.899667 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam\") pod \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.899769 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmcnb\" (UniqueName: \"kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb\") pod \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.899856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0\") pod \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\" (UID: \"f2d64cd0-fb47-4169-88d3-dec3bb7591b0\") " Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.906796 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb" (OuterVolumeSpecName: "kube-api-access-vmcnb") pod "f2d64cd0-fb47-4169-88d3-dec3bb7591b0" (UID: "f2d64cd0-fb47-4169-88d3-dec3bb7591b0"). InnerVolumeSpecName "kube-api-access-vmcnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.935770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f2d64cd0-fb47-4169-88d3-dec3bb7591b0" (UID: "f2d64cd0-fb47-4169-88d3-dec3bb7591b0"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:43 crc kubenswrapper[4776]: I0128 07:21:43.951730 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f2d64cd0-fb47-4169-88d3-dec3bb7591b0" (UID: "f2d64cd0-fb47-4169-88d3-dec3bb7591b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.003863 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.003926 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmcnb\" (UniqueName: \"kubernetes.io/projected/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-kube-api-access-vmcnb\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.003950 4776 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f2d64cd0-fb47-4169-88d3-dec3bb7591b0-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.416967 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" event={"ID":"f2d64cd0-fb47-4169-88d3-dec3bb7591b0","Type":"ContainerDied","Data":"c1ec9f99d6492cc8a5fc39ec09d24759dd77a640440a99d0ad0627f8522fa30d"} Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.417343 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ec9f99d6492cc8a5fc39ec09d24759dd77a640440a99d0ad0627f8522fa30d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.417036 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vb66j" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.504279 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d"] Jan 28 07:21:44 crc kubenswrapper[4776]: E0128 07:21:44.504751 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d64cd0-fb47-4169-88d3-dec3bb7591b0" containerName="ssh-known-hosts-edpm-deployment" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.504772 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d64cd0-fb47-4169-88d3-dec3bb7591b0" containerName="ssh-known-hosts-edpm-deployment" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.505066 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d64cd0-fb47-4169-88d3-dec3bb7591b0" containerName="ssh-known-hosts-edpm-deployment" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.506035 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.514684 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.515110 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.515998 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.516374 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.559788 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d"] Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.618292 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgpl\" (UniqueName: \"kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.618370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.618712 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.722062 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgpl\" (UniqueName: \"kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.723159 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.724446 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.728347 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.728420 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.757502 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgpl\" (UniqueName: \"kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zcz2d\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:44 crc kubenswrapper[4776]: I0128 07:21:44.843396 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:45 crc kubenswrapper[4776]: I0128 07:21:45.416675 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d"] Jan 28 07:21:46 crc kubenswrapper[4776]: I0128 07:21:46.439393 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" event={"ID":"fa499a35-59bb-4ee1-93b4-98ab890c2126","Type":"ContainerStarted","Data":"8abf692d644fedaef88f8c383223ff0928800ac72ce4ee237e1e12a53c0d2e4e"} Jan 28 07:21:46 crc kubenswrapper[4776]: I0128 07:21:46.439706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" event={"ID":"fa499a35-59bb-4ee1-93b4-98ab890c2126","Type":"ContainerStarted","Data":"7ea8d0a1267ce315f9df09909aecfc89e508460919a3f058bd5b8f8e1b073575"} Jan 28 07:21:48 crc kubenswrapper[4776]: I0128 07:21:48.611080 4776 scope.go:117] "RemoveContainer" containerID="2a2fcc8f96b985d910f5b4e5f353da1467c1090ce018f09c73fe3b6641d262f4" Jan 28 07:21:53 crc kubenswrapper[4776]: I0128 07:21:53.305780 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:21:53 crc kubenswrapper[4776]: E0128 07:21:53.306660 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:21:55 crc kubenswrapper[4776]: I0128 07:21:55.533890 4776 generic.go:334] "Generic (PLEG): container finished" podID="fa499a35-59bb-4ee1-93b4-98ab890c2126" 
containerID="8abf692d644fedaef88f8c383223ff0928800ac72ce4ee237e1e12a53c0d2e4e" exitCode=0 Jan 28 07:21:55 crc kubenswrapper[4776]: I0128 07:21:55.533993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" event={"ID":"fa499a35-59bb-4ee1-93b4-98ab890c2126","Type":"ContainerDied","Data":"8abf692d644fedaef88f8c383223ff0928800ac72ce4ee237e1e12a53c0d2e4e"} Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.034530 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.152531 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam\") pod \"fa499a35-59bb-4ee1-93b4-98ab890c2126\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.152876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgpl\" (UniqueName: \"kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl\") pod \"fa499a35-59bb-4ee1-93b4-98ab890c2126\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.153017 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory\") pod \"fa499a35-59bb-4ee1-93b4-98ab890c2126\" (UID: \"fa499a35-59bb-4ee1-93b4-98ab890c2126\") " Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.157863 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl" (OuterVolumeSpecName: "kube-api-access-xfgpl") pod 
"fa499a35-59bb-4ee1-93b4-98ab890c2126" (UID: "fa499a35-59bb-4ee1-93b4-98ab890c2126"). InnerVolumeSpecName "kube-api-access-xfgpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.178664 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory" (OuterVolumeSpecName: "inventory") pod "fa499a35-59bb-4ee1-93b4-98ab890c2126" (UID: "fa499a35-59bb-4ee1-93b4-98ab890c2126"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.179068 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fa499a35-59bb-4ee1-93b4-98ab890c2126" (UID: "fa499a35-59bb-4ee1-93b4-98ab890c2126"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.255160 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.255202 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgpl\" (UniqueName: \"kubernetes.io/projected/fa499a35-59bb-4ee1-93b4-98ab890c2126-kube-api-access-xfgpl\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.255214 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa499a35-59bb-4ee1-93b4-98ab890c2126-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.553378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" event={"ID":"fa499a35-59bb-4ee1-93b4-98ab890c2126","Type":"ContainerDied","Data":"7ea8d0a1267ce315f9df09909aecfc89e508460919a3f058bd5b8f8e1b073575"} Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.553417 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea8d0a1267ce315f9df09909aecfc89e508460919a3f058bd5b8f8e1b073575" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.553490 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zcz2d" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.655550 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm"] Jan 28 07:21:57 crc kubenswrapper[4776]: E0128 07:21:57.655988 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa499a35-59bb-4ee1-93b4-98ab890c2126" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.656010 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa499a35-59bb-4ee1-93b4-98ab890c2126" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.656266 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa499a35-59bb-4ee1-93b4-98ab890c2126" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.656954 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.660182 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.660288 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.660507 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.661490 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.685140 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm"] Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.764465 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6rg\" (UniqueName: \"kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.764669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 
07:21:57.764713 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.867219 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.867556 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.867927 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6rg\" (UniqueName: \"kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.871746 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.872300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.890537 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6rg\" (UniqueName: \"kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:57 crc kubenswrapper[4776]: I0128 07:21:57.976043 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:21:58 crc kubenswrapper[4776]: I0128 07:21:58.530841 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm"] Jan 28 07:21:58 crc kubenswrapper[4776]: W0128 07:21:58.535296 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7a8bf84_fa6c_4637_bac6_cb9da6206f31.slice/crio-5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544 WatchSource:0}: Error finding container 5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544: Status 404 returned error can't find the container with id 5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544 Jan 28 07:21:58 crc kubenswrapper[4776]: I0128 07:21:58.565316 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" event={"ID":"e7a8bf84-fa6c-4637-bac6-cb9da6206f31","Type":"ContainerStarted","Data":"5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544"} Jan 28 07:21:59 crc kubenswrapper[4776]: I0128 07:21:59.578414 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" event={"ID":"e7a8bf84-fa6c-4637-bac6-cb9da6206f31","Type":"ContainerStarted","Data":"43ecf269d155d76dfce23486d2e62a64f6a9bf3d73a928a021b4ed0621b86e8d"} Jan 28 07:21:59 crc kubenswrapper[4776]: I0128 07:21:59.598641 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" podStartSLOduration=2.103514599 podStartE2EDuration="2.598615839s" podCreationTimestamp="2026-01-28 07:21:57 +0000 UTC" firstStartedPulling="2026-01-28 07:21:58.537853993 +0000 UTC m=+1889.953514163" lastFinishedPulling="2026-01-28 07:21:59.032955243 +0000 UTC m=+1890.448615403" 
observedRunningTime="2026-01-28 07:21:59.595431094 +0000 UTC m=+1891.011091364" watchObservedRunningTime="2026-01-28 07:21:59.598615839 +0000 UTC m=+1891.014275999" Jan 28 07:22:06 crc kubenswrapper[4776]: I0128 07:22:06.304584 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:22:06 crc kubenswrapper[4776]: E0128 07:22:06.305519 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:22:10 crc kubenswrapper[4776]: I0128 07:22:10.677970 4776 generic.go:334] "Generic (PLEG): container finished" podID="e7a8bf84-fa6c-4637-bac6-cb9da6206f31" containerID="43ecf269d155d76dfce23486d2e62a64f6a9bf3d73a928a021b4ed0621b86e8d" exitCode=0 Jan 28 07:22:10 crc kubenswrapper[4776]: I0128 07:22:10.678066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" event={"ID":"e7a8bf84-fa6c-4637-bac6-cb9da6206f31","Type":"ContainerDied","Data":"43ecf269d155d76dfce23486d2e62a64f6a9bf3d73a928a021b4ed0621b86e8d"} Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.129794 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.172515 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6rg\" (UniqueName: \"kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg\") pod \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.172889 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory\") pod \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.172919 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam\") pod \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\" (UID: \"e7a8bf84-fa6c-4637-bac6-cb9da6206f31\") " Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.183818 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg" (OuterVolumeSpecName: "kube-api-access-qs6rg") pod "e7a8bf84-fa6c-4637-bac6-cb9da6206f31" (UID: "e7a8bf84-fa6c-4637-bac6-cb9da6206f31"). InnerVolumeSpecName "kube-api-access-qs6rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.202800 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7a8bf84-fa6c-4637-bac6-cb9da6206f31" (UID: "e7a8bf84-fa6c-4637-bac6-cb9da6206f31"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.211247 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory" (OuterVolumeSpecName: "inventory") pod "e7a8bf84-fa6c-4637-bac6-cb9da6206f31" (UID: "e7a8bf84-fa6c-4637-bac6-cb9da6206f31"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.277572 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6rg\" (UniqueName: \"kubernetes.io/projected/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-kube-api-access-qs6rg\") on node \"crc\" DevicePath \"\"" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.277614 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.277627 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7a8bf84-fa6c-4637-bac6-cb9da6206f31-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.708767 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" 
event={"ID":"e7a8bf84-fa6c-4637-bac6-cb9da6206f31","Type":"ContainerDied","Data":"5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544"} Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.708816 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e582604ec62fdc9c0c73f6f6cc068db0e6fd37b25510f12ff08f1f2aef94544" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.708893 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.852844 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw"] Jan 28 07:22:12 crc kubenswrapper[4776]: E0128 07:22:12.853475 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a8bf84-fa6c-4637-bac6-cb9da6206f31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.853508 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a8bf84-fa6c-4637-bac6-cb9da6206f31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.853838 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a8bf84-fa6c-4637-bac6-cb9da6206f31" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.854739 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.859005 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.859756 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860665 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860687 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860725 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860732 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860682 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.860918 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:22:12 crc kubenswrapper[4776]: I0128 07:22:12.890292 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw"] Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001013 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001406 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001572 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001698 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001818 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.001949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002117 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8m4\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002226 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.002964 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.003106 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105349 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105375 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105441 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8m4\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105582 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105620 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105723 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.105794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.111072 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.111400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.112820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.113144 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.114223 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.114416 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.114464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.116207 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.119440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.119486 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.119769 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.120536 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.122249 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.137337 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8m4\" (UniqueName: 
\"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.208276 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:22:13 crc kubenswrapper[4776]: I0128 07:22:13.835138 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw"] Jan 28 07:22:14 crc kubenswrapper[4776]: I0128 07:22:14.728419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" event={"ID":"f05748ac-8e6e-4713-ae86-b0e4ffadec84","Type":"ContainerStarted","Data":"df1d27e1fbc134d8315a3f83dea47297c86fd5ef06dca6489edc37b639971b0c"} Jan 28 07:22:14 crc kubenswrapper[4776]: I0128 07:22:14.729004 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" event={"ID":"f05748ac-8e6e-4713-ae86-b0e4ffadec84","Type":"ContainerStarted","Data":"6ab45c30f7c8cfd1bf81a80c9c372539bd4c051240c8c79de5b4e363c174e2f0"} Jan 28 07:22:14 crc kubenswrapper[4776]: I0128 07:22:14.756491 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" podStartSLOduration=2.253414211 podStartE2EDuration="2.756472774s" podCreationTimestamp="2026-01-28 07:22:12 +0000 UTC" firstStartedPulling="2026-01-28 07:22:13.839681147 +0000 UTC m=+1905.255341307" lastFinishedPulling="2026-01-28 07:22:14.34273971 +0000 UTC m=+1905.758399870" observedRunningTime="2026-01-28 07:22:14.750419461 +0000 UTC m=+1906.166079621" watchObservedRunningTime="2026-01-28 07:22:14.756472774 +0000 UTC 
m=+1906.172132934" Jan 28 07:22:20 crc kubenswrapper[4776]: I0128 07:22:20.305025 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:22:20 crc kubenswrapper[4776]: E0128 07:22:20.306341 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:22:35 crc kubenswrapper[4776]: I0128 07:22:35.307660 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:22:35 crc kubenswrapper[4776]: E0128 07:22:35.308727 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:22:48 crc kubenswrapper[4776]: I0128 07:22:48.304584 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:22:48 crc kubenswrapper[4776]: E0128 07:22:48.305443 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" 
podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:23:00 crc kubenswrapper[4776]: I0128 07:23:00.206116 4776 generic.go:334] "Generic (PLEG): container finished" podID="f05748ac-8e6e-4713-ae86-b0e4ffadec84" containerID="df1d27e1fbc134d8315a3f83dea47297c86fd5ef06dca6489edc37b639971b0c" exitCode=0 Jan 28 07:23:00 crc kubenswrapper[4776]: I0128 07:23:00.206199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" event={"ID":"f05748ac-8e6e-4713-ae86-b0e4ffadec84","Type":"ContainerDied","Data":"df1d27e1fbc134d8315a3f83dea47297c86fd5ef06dca6489edc37b639971b0c"} Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.733320 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744528 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744628 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744721 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: 
\"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744748 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744897 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8m4\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744930 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.744984 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745008 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745051 
4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745074 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745134 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745162 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745179 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.745208 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle\") pod \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\" (UID: \"f05748ac-8e6e-4713-ae86-b0e4ffadec84\") " Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.754766 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.755574 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.755526 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.757127 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.757883 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.759354 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.759953 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.760610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4" (OuterVolumeSpecName: "kube-api-access-db8m4") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "kube-api-access-db8m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.760712 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.761408 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.762190 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.763185 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.786009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.809461 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory" (OuterVolumeSpecName: "inventory") pod "f05748ac-8e6e-4713-ae86-b0e4ffadec84" (UID: "f05748ac-8e6e-4713-ae86-b0e4ffadec84"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847403 4776 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847447 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847463 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847478 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847493 4776 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847509 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847522 4776 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847535 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847568 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847582 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847595 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db8m4\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-kube-api-access-db8m4\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847607 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f05748ac-8e6e-4713-ae86-b0e4ffadec84-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: I0128 07:23:01.847620 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:01 crc kubenswrapper[4776]: 
I0128 07:23:01.847633 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05748ac-8e6e-4713-ae86-b0e4ffadec84-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.238744 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" event={"ID":"f05748ac-8e6e-4713-ae86-b0e4ffadec84","Type":"ContainerDied","Data":"6ab45c30f7c8cfd1bf81a80c9c372539bd4c051240c8c79de5b4e363c174e2f0"} Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.238816 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ab45c30f7c8cfd1bf81a80c9c372539bd4c051240c8c79de5b4e363c174e2f0" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.238922 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.304444 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:23:02 crc kubenswrapper[4776]: E0128 07:23:02.305295 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.361360 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8"] Jan 28 07:23:02 crc kubenswrapper[4776]: E0128 07:23:02.361721 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f05748ac-8e6e-4713-ae86-b0e4ffadec84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.361738 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05748ac-8e6e-4713-ae86-b0e4ffadec84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.361914 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05748ac-8e6e-4713-ae86-b0e4ffadec84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.362535 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.462981 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdx87\" (UniqueName: \"kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.463042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.463118 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.463180 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.463220 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.565799 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdx87\" (UniqueName: \"kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.565874 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.565978 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.566065 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.566128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.573349 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.574063 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.574429 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.574883 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:23:02 crc 
kubenswrapper[4776]: I0128 07:23:02.575122 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.580608 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.585331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.585852 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.589640 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.594436 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8"] Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.600423 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdx87\" (UniqueName: \"kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpxh8\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:02 crc kubenswrapper[4776]: I0128 07:23:02.679200 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:23:03 crc kubenswrapper[4776]: I0128 07:23:03.219110 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8"] Jan 28 07:23:03 crc kubenswrapper[4776]: W0128 07:23:03.221359 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34505caa_b76e_404f_b71a_a863e549d905.slice/crio-fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790 WatchSource:0}: Error finding container fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790: Status 404 returned error can't find the container with id fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790 Jan 28 07:23:03 crc kubenswrapper[4776]: I0128 07:23:03.251361 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" event={"ID":"34505caa-b76e-404f-b71a-a863e549d905","Type":"ContainerStarted","Data":"fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790"} Jan 28 07:23:04 crc kubenswrapper[4776]: I0128 07:23:04.264113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" 
event={"ID":"34505caa-b76e-404f-b71a-a863e549d905","Type":"ContainerStarted","Data":"89faf9629eeb49010305b970bd416ec5e82b44995622aaccda4b3ab72df8078c"} Jan 28 07:23:04 crc kubenswrapper[4776]: I0128 07:23:04.295272 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" podStartSLOduration=1.7232441139999999 podStartE2EDuration="2.29525173s" podCreationTimestamp="2026-01-28 07:23:02 +0000 UTC" firstStartedPulling="2026-01-28 07:23:03.22459487 +0000 UTC m=+1954.640255070" lastFinishedPulling="2026-01-28 07:23:03.796602486 +0000 UTC m=+1955.212262686" observedRunningTime="2026-01-28 07:23:04.282010487 +0000 UTC m=+1955.697670657" watchObservedRunningTime="2026-01-28 07:23:04.29525173 +0000 UTC m=+1955.710911900" Jan 28 07:23:13 crc kubenswrapper[4776]: I0128 07:23:13.305147 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:23:13 crc kubenswrapper[4776]: E0128 07:23:13.306363 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:23:25 crc kubenswrapper[4776]: I0128 07:23:25.305478 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:23:25 crc kubenswrapper[4776]: E0128 07:23:25.306319 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:23:36 crc kubenswrapper[4776]: I0128 07:23:36.305381 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:23:36 crc kubenswrapper[4776]: I0128 07:23:36.610183 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4"} Jan 28 07:24:20 crc kubenswrapper[4776]: I0128 07:24:20.047813 4776 generic.go:334] "Generic (PLEG): container finished" podID="34505caa-b76e-404f-b71a-a863e549d905" containerID="89faf9629eeb49010305b970bd416ec5e82b44995622aaccda4b3ab72df8078c" exitCode=0 Jan 28 07:24:20 crc kubenswrapper[4776]: I0128 07:24:20.047893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" event={"ID":"34505caa-b76e-404f-b71a-a863e549d905","Type":"ContainerDied","Data":"89faf9629eeb49010305b970bd416ec5e82b44995622aaccda4b3ab72df8078c"} Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.600468 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.748256 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdx87\" (UniqueName: \"kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87\") pod \"34505caa-b76e-404f-b71a-a863e549d905\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.748484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory\") pod \"34505caa-b76e-404f-b71a-a863e549d905\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.748573 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0\") pod \"34505caa-b76e-404f-b71a-a863e549d905\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.748679 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam\") pod \"34505caa-b76e-404f-b71a-a863e549d905\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.748728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle\") pod \"34505caa-b76e-404f-b71a-a863e549d905\" (UID: \"34505caa-b76e-404f-b71a-a863e549d905\") " Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.753396 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "34505caa-b76e-404f-b71a-a863e549d905" (UID: "34505caa-b76e-404f-b71a-a863e549d905"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.763870 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87" (OuterVolumeSpecName: "kube-api-access-sdx87") pod "34505caa-b76e-404f-b71a-a863e549d905" (UID: "34505caa-b76e-404f-b71a-a863e549d905"). InnerVolumeSpecName "kube-api-access-sdx87". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.778378 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34505caa-b76e-404f-b71a-a863e549d905" (UID: "34505caa-b76e-404f-b71a-a863e549d905"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.782587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory" (OuterVolumeSpecName: "inventory") pod "34505caa-b76e-404f-b71a-a863e549d905" (UID: "34505caa-b76e-404f-b71a-a863e549d905"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.800382 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "34505caa-b76e-404f-b71a-a863e549d905" (UID: "34505caa-b76e-404f-b71a-a863e549d905"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.851572 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdx87\" (UniqueName: \"kubernetes.io/projected/34505caa-b76e-404f-b71a-a863e549d905-kube-api-access-sdx87\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.851609 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.851619 4776 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/34505caa-b76e-404f-b71a-a863e549d905-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.851627 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:21 crc kubenswrapper[4776]: I0128 07:24:21.851637 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34505caa-b76e-404f-b71a-a863e549d905-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.074032 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" event={"ID":"34505caa-b76e-404f-b71a-a863e549d905","Type":"ContainerDied","Data":"fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790"} Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.074507 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb2183dee9642f71ca4b318c593929b8ddefef6836593eadfed55569886fd790" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.074125 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpxh8" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.246817 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg"] Jan 28 07:24:22 crc kubenswrapper[4776]: E0128 07:24:22.247533 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34505caa-b76e-404f-b71a-a863e549d905" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.247705 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="34505caa-b76e-404f-b71a-a863e549d905" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.248276 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="34505caa-b76e-404f-b71a-a863e549d905" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.249131 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.251856 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.251879 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.252228 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.254050 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.254333 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.254670 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.258263 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg"] Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361013 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361106 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7q5\" (UniqueName: \"kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361172 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361265 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361368 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.361410 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.463802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7q5\" (UniqueName: \"kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.463894 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.463973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.464047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.464085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.464208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.468468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.468727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.469171 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.470196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.476404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.490918 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7q5\" (UniqueName: \"kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:22 crc kubenswrapper[4776]: I0128 07:24:22.567593 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:24:23 crc kubenswrapper[4776]: I0128 07:24:23.131187 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg"] Jan 28 07:24:24 crc kubenswrapper[4776]: I0128 07:24:24.116101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" event={"ID":"36edfabc-d31a-4c3f-98d0-3c830a282c65","Type":"ContainerStarted","Data":"8f87ff914b619ccfb2fac485cbd459f6055b887b477cf4687f1577bf505b0e50"} Jan 28 07:24:25 crc kubenswrapper[4776]: I0128 07:24:25.132852 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" event={"ID":"36edfabc-d31a-4c3f-98d0-3c830a282c65","Type":"ContainerStarted","Data":"c79ffcffd7c2f86d27306fa3a2ca98c9b1760645d489895806025beaa02b8cf0"} Jan 28 07:24:25 crc kubenswrapper[4776]: I0128 07:24:25.158637 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" podStartSLOduration=2.365009553 podStartE2EDuration="3.158608525s" podCreationTimestamp="2026-01-28 07:24:22 +0000 UTC" firstStartedPulling="2026-01-28 07:24:23.141644379 +0000 UTC m=+2034.557304539" lastFinishedPulling="2026-01-28 07:24:23.935243351 +0000 UTC m=+2035.350903511" observedRunningTime="2026-01-28 07:24:25.157728472 +0000 UTC m=+2036.573388632" watchObservedRunningTime="2026-01-28 07:24:25.158608525 +0000 UTC m=+2036.574268725" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.535884 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] 
Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.541480 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.551739 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.652064 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.652112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.652142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv7v\" (UniqueName: \"kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.754325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " 
pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.754369 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.754406 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv7v\" (UniqueName: \"kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.755247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.755520 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.786157 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv7v\" (UniqueName: \"kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v\") pod \"community-operators-bl4vb\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " 
pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:36 crc kubenswrapper[4776]: I0128 07:24:36.865793 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:37 crc kubenswrapper[4776]: I0128 07:24:37.464895 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:24:37 crc kubenswrapper[4776]: W0128 07:24:37.471934 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b09add3_701c_4c0e_ac04_0e974a7bec0d.slice/crio-73864fa6278ecb5fd1b76f457ec6a06d1fa41c1791cb6bd32ccdce62f264b6d2 WatchSource:0}: Error finding container 73864fa6278ecb5fd1b76f457ec6a06d1fa41c1791cb6bd32ccdce62f264b6d2: Status 404 returned error can't find the container with id 73864fa6278ecb5fd1b76f457ec6a06d1fa41c1791cb6bd32ccdce62f264b6d2 Jan 28 07:24:38 crc kubenswrapper[4776]: I0128 07:24:38.271440 4776 generic.go:334] "Generic (PLEG): container finished" podID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerID="26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9" exitCode=0 Jan 28 07:24:38 crc kubenswrapper[4776]: I0128 07:24:38.271668 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerDied","Data":"26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9"} Jan 28 07:24:38 crc kubenswrapper[4776]: I0128 07:24:38.271765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerStarted","Data":"73864fa6278ecb5fd1b76f457ec6a06d1fa41c1791cb6bd32ccdce62f264b6d2"} Jan 28 07:24:43 crc kubenswrapper[4776]: I0128 07:24:43.321494 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerStarted","Data":"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b"} Jan 28 07:24:44 crc kubenswrapper[4776]: I0128 07:24:44.332573 4776 generic.go:334] "Generic (PLEG): container finished" podID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerID="ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b" exitCode=0 Jan 28 07:24:44 crc kubenswrapper[4776]: I0128 07:24:44.332635 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerDied","Data":"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b"} Jan 28 07:24:45 crc kubenswrapper[4776]: I0128 07:24:45.346751 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerStarted","Data":"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c"} Jan 28 07:24:45 crc kubenswrapper[4776]: I0128 07:24:45.418496 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bl4vb" podStartSLOduration=2.765926368 podStartE2EDuration="9.418467742s" podCreationTimestamp="2026-01-28 07:24:36 +0000 UTC" firstStartedPulling="2026-01-28 07:24:38.27425995 +0000 UTC m=+2049.689920120" lastFinishedPulling="2026-01-28 07:24:44.926801324 +0000 UTC m=+2056.342461494" observedRunningTime="2026-01-28 07:24:45.410628872 +0000 UTC m=+2056.826289042" watchObservedRunningTime="2026-01-28 07:24:45.418467742 +0000 UTC m=+2056.834127912" Jan 28 07:24:46 crc kubenswrapper[4776]: I0128 07:24:46.866673 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:46 crc kubenswrapper[4776]: I0128 07:24:46.867025 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:47 crc kubenswrapper[4776]: I0128 07:24:47.913024 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bl4vb" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" probeResult="failure" output=< Jan 28 07:24:47 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:24:47 crc kubenswrapper[4776]: > Jan 28 07:24:56 crc kubenswrapper[4776]: I0128 07:24:56.960585 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.021624 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.099902 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.199885 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.200113 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x677g" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="registry-server" containerID="cri-o://b19529727f27f20eabe1df5fca283e6dbd879334f832f105eb9cade7f114efea" gracePeriod=2 Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.474478 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerID="b19529727f27f20eabe1df5fca283e6dbd879334f832f105eb9cade7f114efea" exitCode=0 Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.474589 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerDied","Data":"b19529727f27f20eabe1df5fca283e6dbd879334f832f105eb9cade7f114efea"} Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.699238 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x677g" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.802636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content\") pod \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.803001 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv6qq\" (UniqueName: \"kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq\") pod \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.803141 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities\") pod \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\" (UID: \"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5\") " Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.805174 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities" (OuterVolumeSpecName: "utilities") pod "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" (UID: "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.811683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq" (OuterVolumeSpecName: "kube-api-access-xv6qq") pod "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" (UID: "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5"). InnerVolumeSpecName "kube-api-access-xv6qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.900130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" (UID: "1b7edb3b-4ee6-4b78-b0de-3715cc630ab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.905460 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv6qq\" (UniqueName: \"kubernetes.io/projected/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-kube-api-access-xv6qq\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.905498 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:57 crc kubenswrapper[4776]: I0128 07:24:57.905508 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.485808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x677g" 
event={"ID":"1b7edb3b-4ee6-4b78-b0de-3715cc630ab5","Type":"ContainerDied","Data":"1a56efb7eacc85eaa6734f94881019cab2996e28a71a2234bfc661ede28b2fd4"} Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.485881 4776 scope.go:117] "RemoveContainer" containerID="b19529727f27f20eabe1df5fca283e6dbd879334f832f105eb9cade7f114efea" Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.486004 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x677g" Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.513234 4776 scope.go:117] "RemoveContainer" containerID="e6b2383f0420ea8f2351d6a92f6414954af6b36322072384367e1f79a7b7d455" Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.523528 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.531716 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x677g"] Jan 28 07:24:58 crc kubenswrapper[4776]: I0128 07:24:58.546018 4776 scope.go:117] "RemoveContainer" containerID="2dfdbc0a4d813d18fda08d0e643ee92e65af8c0742c4053502fb15ed1580e7a9" Jan 28 07:24:59 crc kubenswrapper[4776]: I0128 07:24:59.321248 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" path="/var/lib/kubelet/pods/1b7edb3b-4ee6-4b78-b0de-3715cc630ab5/volumes" Jan 28 07:25:21 crc kubenswrapper[4776]: I0128 07:25:21.717288 4776 generic.go:334] "Generic (PLEG): container finished" podID="36edfabc-d31a-4c3f-98d0-3c830a282c65" containerID="c79ffcffd7c2f86d27306fa3a2ca98c9b1760645d489895806025beaa02b8cf0" exitCode=0 Jan 28 07:25:21 crc kubenswrapper[4776]: I0128 07:25:21.717483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" 
event={"ID":"36edfabc-d31a-4c3f-98d0-3c830a282c65","Type":"ContainerDied","Data":"c79ffcffd7c2f86d27306fa3a2ca98c9b1760645d489895806025beaa02b8cf0"} Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.139055 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215098 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215208 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj7q5\" (UniqueName: \"kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215310 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.215340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle\") pod \"36edfabc-d31a-4c3f-98d0-3c830a282c65\" (UID: \"36edfabc-d31a-4c3f-98d0-3c830a282c65\") " Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.221777 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.221869 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5" (OuterVolumeSpecName: "kube-api-access-pj7q5") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "kube-api-access-pj7q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.245436 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.245624 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.250430 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.255993 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory" (OuterVolumeSpecName: "inventory") pod "36edfabc-d31a-4c3f-98d0-3c830a282c65" (UID: "36edfabc-d31a-4c3f-98d0-3c830a282c65"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328392 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328432 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328448 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj7q5\" (UniqueName: \"kubernetes.io/projected/36edfabc-d31a-4c3f-98d0-3c830a282c65-kube-api-access-pj7q5\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328465 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328479 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.328493 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edfabc-d31a-4c3f-98d0-3c830a282c65-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.738866 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" event={"ID":"36edfabc-d31a-4c3f-98d0-3c830a282c65","Type":"ContainerDied","Data":"8f87ff914b619ccfb2fac485cbd459f6055b887b477cf4687f1577bf505b0e50"} Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.738905 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f87ff914b619ccfb2fac485cbd459f6055b887b477cf4687f1577bf505b0e50" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.739232 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.848102 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn"] Jan 28 07:25:23 crc kubenswrapper[4776]: E0128 07:25:23.848921 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="registry-server" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.848943 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="registry-server" Jan 28 07:25:23 crc kubenswrapper[4776]: E0128 07:25:23.848969 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="extract-content" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.848979 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="extract-content" Jan 28 07:25:23 crc kubenswrapper[4776]: E0128 07:25:23.848993 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36edfabc-d31a-4c3f-98d0-3c830a282c65" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.849003 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36edfabc-d31a-4c3f-98d0-3c830a282c65" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 07:25:23 crc kubenswrapper[4776]: E0128 07:25:23.849023 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="extract-utilities" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.849032 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="extract-utilities" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.849321 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="36edfabc-d31a-4c3f-98d0-3c830a282c65" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.849341 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b7edb3b-4ee6-4b78-b0de-3715cc630ab5" containerName="registry-server" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.850248 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.854283 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.854952 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.855228 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.855386 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.865201 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.872088 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn"] Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.940467 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.940573 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: 
\"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.940897 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.940981 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:23 crc kubenswrapper[4776]: I0128 07:25:23.941421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djnkk\" (UniqueName: \"kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.043581 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.044108 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.044424 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.044677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.044920 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djnkk\" (UniqueName: \"kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.051213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: 
\"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.051976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.052387 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.062507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.078174 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djnkk\" (UniqueName: \"kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-288pn\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.169200 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:25:24 crc kubenswrapper[4776]: I0128 07:25:24.757065 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn"] Jan 28 07:25:25 crc kubenswrapper[4776]: I0128 07:25:25.771216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" event={"ID":"5e450505-d924-4be0-8491-92297f012e24","Type":"ContainerStarted","Data":"d055b561edd05b71d34ab3946f7d8e39847c32ac29ecb699d9fdb0ffa23f459d"} Jan 28 07:25:26 crc kubenswrapper[4776]: I0128 07:25:26.778821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" event={"ID":"5e450505-d924-4be0-8491-92297f012e24","Type":"ContainerStarted","Data":"345797b7485f49f5bf274f18a2f9230258b9773ca85b09c37ec68c3689a6de5b"} Jan 28 07:25:26 crc kubenswrapper[4776]: I0128 07:25:26.805209 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" podStartSLOduration=2.851401267 podStartE2EDuration="3.805185992s" podCreationTimestamp="2026-01-28 07:25:23 +0000 UTC" firstStartedPulling="2026-01-28 07:25:24.764573874 +0000 UTC m=+2096.180234024" lastFinishedPulling="2026-01-28 07:25:25.718358589 +0000 UTC m=+2097.134018749" observedRunningTime="2026-01-28 07:25:26.797130427 +0000 UTC m=+2098.212790587" watchObservedRunningTime="2026-01-28 07:25:26.805185992 +0000 UTC m=+2098.220846152" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.181763 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.184224 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.200397 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.347005 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.347272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mczdm\" (UniqueName: \"kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.347592 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.449453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.449537 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.449658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mczdm\" (UniqueName: \"kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.450105 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.450104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.468097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mczdm\" (UniqueName: \"kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm\") pod \"certified-operators-qj7gv\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:43 crc kubenswrapper[4776]: I0128 07:25:43.503460 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:44 crc kubenswrapper[4776]: I0128 07:25:44.015072 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:44 crc kubenswrapper[4776]: I0128 07:25:44.942562 4776 generic.go:334] "Generic (PLEG): container finished" podID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerID="b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372" exitCode=0 Jan 28 07:25:44 crc kubenswrapper[4776]: I0128 07:25:44.942703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerDied","Data":"b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372"} Jan 28 07:25:44 crc kubenswrapper[4776]: I0128 07:25:44.942928 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerStarted","Data":"0e03f7504c0ec6a96386436a6c86d2adbc2a919c4ebdef5eba7d414232ac7778"} Jan 28 07:25:46 crc kubenswrapper[4776]: I0128 07:25:46.964336 4776 generic.go:334] "Generic (PLEG): container finished" podID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerID="204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de" exitCode=0 Jan 28 07:25:46 crc kubenswrapper[4776]: I0128 07:25:46.964438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerDied","Data":"204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de"} Jan 28 07:25:47 crc kubenswrapper[4776]: I0128 07:25:47.978255 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" 
event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerStarted","Data":"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5"} Jan 28 07:25:48 crc kubenswrapper[4776]: I0128 07:25:48.000013 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qj7gv" podStartSLOduration=2.539438292 podStartE2EDuration="4.99998912s" podCreationTimestamp="2026-01-28 07:25:43 +0000 UTC" firstStartedPulling="2026-01-28 07:25:44.945836719 +0000 UTC m=+2116.361496919" lastFinishedPulling="2026-01-28 07:25:47.406387587 +0000 UTC m=+2118.822047747" observedRunningTime="2026-01-28 07:25:47.996283271 +0000 UTC m=+2119.411943431" watchObservedRunningTime="2026-01-28 07:25:47.99998912 +0000 UTC m=+2119.415649280" Jan 28 07:25:53 crc kubenswrapper[4776]: I0128 07:25:53.505279 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:53 crc kubenswrapper[4776]: I0128 07:25:53.507050 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:53 crc kubenswrapper[4776]: I0128 07:25:53.581144 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:54 crc kubenswrapper[4776]: I0128 07:25:54.081606 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:54 crc kubenswrapper[4776]: I0128 07:25:54.136874 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.053201 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qj7gv" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="registry-server" 
containerID="cri-o://1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5" gracePeriod=2 Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.556154 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.631191 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mczdm\" (UniqueName: \"kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm\") pod \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.632937 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content\") pod \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.633050 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities\") pod \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\" (UID: \"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f\") " Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.634081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities" (OuterVolumeSpecName: "utilities") pod "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" (UID: "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.634236 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.641223 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm" (OuterVolumeSpecName: "kube-api-access-mczdm") pod "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" (UID: "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f"). InnerVolumeSpecName "kube-api-access-mczdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.691058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" (UID: "987cc4b3-fee7-455a-b9b5-7a9cabc0b03f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.736079 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:56 crc kubenswrapper[4776]: I0128 07:25:56.736129 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mczdm\" (UniqueName: \"kubernetes.io/projected/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f-kube-api-access-mczdm\") on node \"crc\" DevicePath \"\"" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.063772 4776 generic.go:334] "Generic (PLEG): container finished" podID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerID="1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5" exitCode=0 Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.063847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerDied","Data":"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5"} Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.063884 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qj7gv" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.065429 4776 scope.go:117] "RemoveContainer" containerID="1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.065330 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qj7gv" event={"ID":"987cc4b3-fee7-455a-b9b5-7a9cabc0b03f","Type":"ContainerDied","Data":"0e03f7504c0ec6a96386436a6c86d2adbc2a919c4ebdef5eba7d414232ac7778"} Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.115260 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.121069 4776 scope.go:117] "RemoveContainer" containerID="204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.130480 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qj7gv"] Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.147755 4776 scope.go:117] "RemoveContainer" containerID="b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.198016 4776 scope.go:117] "RemoveContainer" containerID="1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5" Jan 28 07:25:57 crc kubenswrapper[4776]: E0128 07:25:57.198528 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5\": container with ID starting with 1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5 not found: ID does not exist" containerID="1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.198649 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5"} err="failed to get container status \"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5\": rpc error: code = NotFound desc = could not find container \"1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5\": container with ID starting with 1f5965e06a16c85ec33ae63d9b4ac5d21db3d09bd30a43060af87d9d6b8f48b5 not found: ID does not exist" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.198886 4776 scope.go:117] "RemoveContainer" containerID="204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de" Jan 28 07:25:57 crc kubenswrapper[4776]: E0128 07:25:57.199308 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de\": container with ID starting with 204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de not found: ID does not exist" containerID="204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.199350 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de"} err="failed to get container status \"204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de\": rpc error: code = NotFound desc = could not find container \"204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de\": container with ID starting with 204936a2dbff4b92329b266e1a188f6f280f9ffbdc701e812055252ea23b26de not found: ID does not exist" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.199382 4776 scope.go:117] "RemoveContainer" containerID="b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372" Jan 28 07:25:57 crc kubenswrapper[4776]: E0128 
07:25:57.199914 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372\": container with ID starting with b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372 not found: ID does not exist" containerID="b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.199941 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372"} err="failed to get container status \"b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372\": rpc error: code = NotFound desc = could not find container \"b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372\": container with ID starting with b7921bf95468b496aec712a006a32a53711411c2c49377d667704f6d06edf372 not found: ID does not exist" Jan 28 07:25:57 crc kubenswrapper[4776]: I0128 07:25:57.319810 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" path="/var/lib/kubelet/pods/987cc4b3-fee7-455a-b9b5-7a9cabc0b03f/volumes" Jan 28 07:26:03 crc kubenswrapper[4776]: I0128 07:26:03.852267 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:26:03 crc kubenswrapper[4776]: I0128 07:26:03.853044 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 07:26:33 crc kubenswrapper[4776]: I0128 07:26:33.852291 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:26:33 crc kubenswrapper[4776]: I0128 07:26:33.852908 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.776727 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:02 crc kubenswrapper[4776]: E0128 07:27:02.778269 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="extract-utilities" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.778289 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="extract-utilities" Jan 28 07:27:02 crc kubenswrapper[4776]: E0128 07:27:02.778311 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="extract-content" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.778318 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="extract-content" Jan 28 07:27:02 crc kubenswrapper[4776]: E0128 07:27:02.778363 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="registry-server" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.778371 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="registry-server" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.778719 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="987cc4b3-fee7-455a-b9b5-7a9cabc0b03f" containerName="registry-server" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.780941 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.793898 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.842201 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9nq\" (UniqueName: \"kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.842298 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.842359 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.944214 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9nq\" (UniqueName: \"kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.944295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.944359 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.944891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.944980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:02 crc kubenswrapper[4776]: I0128 07:27:02.964268 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tr9nq\" (UniqueName: \"kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq\") pod \"redhat-operators-qpgll\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.124974 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.503214 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.819012 4776 generic.go:334] "Generic (PLEG): container finished" podID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerID="dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59" exitCode=0 Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.819116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerDied","Data":"dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59"} Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.819396 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerStarted","Data":"244d90eaa0e9aea1e2c4fa5ca755d7a2008222b3036ca92348b19e8a7d100477"} Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.820744 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.852150 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.852213 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.852257 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.852962 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:27:03 crc kubenswrapper[4776]: I0128 07:27:03.853023 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4" gracePeriod=600 Jan 28 07:27:04 crc kubenswrapper[4776]: I0128 07:27:04.844290 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4" exitCode=0 Jan 28 07:27:04 crc kubenswrapper[4776]: I0128 07:27:04.844402 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" 
event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4"} Jan 28 07:27:04 crc kubenswrapper[4776]: I0128 07:27:04.845020 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"} Jan 28 07:27:04 crc kubenswrapper[4776]: I0128 07:27:04.845062 4776 scope.go:117] "RemoveContainer" containerID="30ec9b6180113f231d6fdd42ef934c0e0badbc881168c9bf96183d3185f6429b" Jan 28 07:27:05 crc kubenswrapper[4776]: I0128 07:27:05.869208 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerStarted","Data":"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028"} Jan 28 07:27:07 crc kubenswrapper[4776]: I0128 07:27:07.889802 4776 generic.go:334] "Generic (PLEG): container finished" podID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerID="3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028" exitCode=0 Jan 28 07:27:07 crc kubenswrapper[4776]: I0128 07:27:07.890006 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerDied","Data":"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028"} Jan 28 07:27:08 crc kubenswrapper[4776]: I0128 07:27:08.901974 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerStarted","Data":"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7"} Jan 28 07:27:08 crc kubenswrapper[4776]: I0128 07:27:08.920788 4776 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-qpgll" podStartSLOduration=2.448019415 podStartE2EDuration="6.92076951s" podCreationTimestamp="2026-01-28 07:27:02 +0000 UTC" firstStartedPulling="2026-01-28 07:27:03.820473764 +0000 UTC m=+2195.236133924" lastFinishedPulling="2026-01-28 07:27:08.293223859 +0000 UTC m=+2199.708884019" observedRunningTime="2026-01-28 07:27:08.920540294 +0000 UTC m=+2200.336200494" watchObservedRunningTime="2026-01-28 07:27:08.92076951 +0000 UTC m=+2200.336429670" Jan 28 07:27:13 crc kubenswrapper[4776]: I0128 07:27:13.125936 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:13 crc kubenswrapper[4776]: I0128 07:27:13.126585 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:14 crc kubenswrapper[4776]: I0128 07:27:14.203696 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qpgll" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="registry-server" probeResult="failure" output=< Jan 28 07:27:14 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:27:14 crc kubenswrapper[4776]: > Jan 28 07:27:23 crc kubenswrapper[4776]: I0128 07:27:23.175423 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:23 crc kubenswrapper[4776]: I0128 07:27:23.223904 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:23 crc kubenswrapper[4776]: I0128 07:27:23.419856 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.055321 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-qpgll" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="registry-server" containerID="cri-o://78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7" gracePeriod=2 Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.551852 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.648761 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content\") pod \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.649159 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities\") pod \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.649249 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr9nq\" (UniqueName: \"kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq\") pod \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\" (UID: \"33af1c34-6e66-4c15-9c2d-4398ce5016a5\") " Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.650196 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities" (OuterVolumeSpecName: "utilities") pod "33af1c34-6e66-4c15-9c2d-4398ce5016a5" (UID: "33af1c34-6e66-4c15-9c2d-4398ce5016a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.658385 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq" (OuterVolumeSpecName: "kube-api-access-tr9nq") pod "33af1c34-6e66-4c15-9c2d-4398ce5016a5" (UID: "33af1c34-6e66-4c15-9c2d-4398ce5016a5"). InnerVolumeSpecName "kube-api-access-tr9nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.752457 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33af1c34-6e66-4c15-9c2d-4398ce5016a5" (UID: "33af1c34-6e66-4c15-9c2d-4398ce5016a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.752894 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.752922 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr9nq\" (UniqueName: \"kubernetes.io/projected/33af1c34-6e66-4c15-9c2d-4398ce5016a5-kube-api-access-tr9nq\") on node \"crc\" DevicePath \"\"" Jan 28 07:27:25 crc kubenswrapper[4776]: I0128 07:27:25.854776 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af1c34-6e66-4c15-9c2d-4398ce5016a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.066784 4776 generic.go:334] "Generic (PLEG): container finished" podID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" 
containerID="78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7" exitCode=0 Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.066845 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerDied","Data":"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7"} Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.066876 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpgll" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.066889 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpgll" event={"ID":"33af1c34-6e66-4c15-9c2d-4398ce5016a5","Type":"ContainerDied","Data":"244d90eaa0e9aea1e2c4fa5ca755d7a2008222b3036ca92348b19e8a7d100477"} Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.066917 4776 scope.go:117] "RemoveContainer" containerID="78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.113371 4776 scope.go:117] "RemoveContainer" containerID="3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.123601 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.131119 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qpgll"] Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.143844 4776 scope.go:117] "RemoveContainer" containerID="dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.182514 4776 scope.go:117] "RemoveContainer" containerID="78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7" Jan 28 07:27:26 crc 
kubenswrapper[4776]: E0128 07:27:26.182861 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7\": container with ID starting with 78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7 not found: ID does not exist" containerID="78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.182897 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7"} err="failed to get container status \"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7\": rpc error: code = NotFound desc = could not find container \"78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7\": container with ID starting with 78d15ad1e974fde7b6e0ee23596665e21770af84849d49a92633c687faf957e7 not found: ID does not exist" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.182926 4776 scope.go:117] "RemoveContainer" containerID="3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028" Jan 28 07:27:26 crc kubenswrapper[4776]: E0128 07:27:26.183133 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028\": container with ID starting with 3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028 not found: ID does not exist" containerID="3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.183165 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028"} err="failed to get container status 
\"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028\": rpc error: code = NotFound desc = could not find container \"3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028\": container with ID starting with 3a1ff1c2f09f2bd4f678acf0752ce94896c34d3bcef960533c54380633f9b028 not found: ID does not exist" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.183182 4776 scope.go:117] "RemoveContainer" containerID="dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59" Jan 28 07:27:26 crc kubenswrapper[4776]: E0128 07:27:26.183361 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59\": container with ID starting with dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59 not found: ID does not exist" containerID="dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59" Jan 28 07:27:26 crc kubenswrapper[4776]: I0128 07:27:26.183389 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59"} err="failed to get container status \"dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59\": rpc error: code = NotFound desc = could not find container \"dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59\": container with ID starting with dc8278dcfe796c352694a66412b3a259577714cc8d6e2e2e83f22eba3fda9e59 not found: ID does not exist" Jan 28 07:27:27 crc kubenswrapper[4776]: I0128 07:27:27.319615 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" path="/var/lib/kubelet/pods/33af1c34-6e66-4c15-9c2d-4398ce5016a5/volumes" Jan 28 07:29:33 crc kubenswrapper[4776]: I0128 07:29:33.852097 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:29:33 crc kubenswrapper[4776]: I0128 07:29:33.852782 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.161053 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6"] Jan 28 07:30:00 crc kubenswrapper[4776]: E0128 07:30:00.161998 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="registry-server" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.162012 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="registry-server" Jan 28 07:30:00 crc kubenswrapper[4776]: E0128 07:30:00.162041 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="extract-content" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.162046 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="extract-content" Jan 28 07:30:00 crc kubenswrapper[4776]: E0128 07:30:00.162063 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="extract-utilities" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.162070 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="extract-utilities" Jan 28 07:30:00 crc 
kubenswrapper[4776]: I0128 07:30:00.162241 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="33af1c34-6e66-4c15-9c2d-4398ce5016a5" containerName="registry-server" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.162931 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.165917 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.166655 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.176586 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6"] Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.262291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lh6b\" (UniqueName: \"kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.262433 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.262469 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.364293 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lh6b\" (UniqueName: \"kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.364484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.364505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.365494 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.372016 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.386961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lh6b\" (UniqueName: \"kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b\") pod \"collect-profiles-29493090-zkzm6\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:00 crc kubenswrapper[4776]: I0128 07:30:00.492094 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:01 crc kubenswrapper[4776]: I0128 07:30:01.018086 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6"] Jan 28 07:30:01 crc kubenswrapper[4776]: I0128 07:30:01.823913 4776 generic.go:334] "Generic (PLEG): container finished" podID="19c50c04-4144-4b30-899a-c0ed5e61eb11" containerID="476fc56e33a0e03fc74a5babe6cfa95585e7cc3ac70af9eb6dd7aab2136ab12c" exitCode=0 Jan 28 07:30:01 crc kubenswrapper[4776]: I0128 07:30:01.824018 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" event={"ID":"19c50c04-4144-4b30-899a-c0ed5e61eb11","Type":"ContainerDied","Data":"476fc56e33a0e03fc74a5babe6cfa95585e7cc3ac70af9eb6dd7aab2136ab12c"} Jan 28 07:30:01 crc kubenswrapper[4776]: I0128 07:30:01.824349 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" event={"ID":"19c50c04-4144-4b30-899a-c0ed5e61eb11","Type":"ContainerStarted","Data":"6903a32d0cecca1f3013aec86453facff1027d226b1ca404c031f5ee2ecf5f73"} Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.148102 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.220319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lh6b\" (UniqueName: \"kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b\") pod \"19c50c04-4144-4b30-899a-c0ed5e61eb11\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.220431 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume\") pod \"19c50c04-4144-4b30-899a-c0ed5e61eb11\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.220502 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume\") pod \"19c50c04-4144-4b30-899a-c0ed5e61eb11\" (UID: \"19c50c04-4144-4b30-899a-c0ed5e61eb11\") " Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.221301 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume" (OuterVolumeSpecName: "config-volume") pod "19c50c04-4144-4b30-899a-c0ed5e61eb11" (UID: "19c50c04-4144-4b30-899a-c0ed5e61eb11"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.227679 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "19c50c04-4144-4b30-899a-c0ed5e61eb11" (UID: "19c50c04-4144-4b30-899a-c0ed5e61eb11"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.227821 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b" (OuterVolumeSpecName: "kube-api-access-2lh6b") pod "19c50c04-4144-4b30-899a-c0ed5e61eb11" (UID: "19c50c04-4144-4b30-899a-c0ed5e61eb11"). InnerVolumeSpecName "kube-api-access-2lh6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.324014 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c50c04-4144-4b30-899a-c0ed5e61eb11-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.324073 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19c50c04-4144-4b30-899a-c0ed5e61eb11-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.324094 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lh6b\" (UniqueName: \"kubernetes.io/projected/19c50c04-4144-4b30-899a-c0ed5e61eb11-kube-api-access-2lh6b\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.846136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" event={"ID":"19c50c04-4144-4b30-899a-c0ed5e61eb11","Type":"ContainerDied","Data":"6903a32d0cecca1f3013aec86453facff1027d226b1ca404c031f5ee2ecf5f73"} Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.846179 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6903a32d0cecca1f3013aec86453facff1027d226b1ca404c031f5ee2ecf5f73" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.846241 4776 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6" Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.852386 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:30:03 crc kubenswrapper[4776]: I0128 07:30:03.852435 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:30:04 crc kubenswrapper[4776]: I0128 07:30:04.245293 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw"] Jan 28 07:30:04 crc kubenswrapper[4776]: I0128 07:30:04.256111 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493045-67htw"] Jan 28 07:30:05 crc kubenswrapper[4776]: I0128 07:30:05.328607 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784215d5-15f7-4ff3-b0b5-f176cc7b14b2" path="/var/lib/kubelet/pods/784215d5-15f7-4ff3-b0b5-f176cc7b14b2/volumes" Jan 28 07:30:25 crc kubenswrapper[4776]: I0128 07:30:25.097108 4776 generic.go:334] "Generic (PLEG): container finished" podID="5e450505-d924-4be0-8491-92297f012e24" containerID="345797b7485f49f5bf274f18a2f9230258b9773ca85b09c37ec68c3689a6de5b" exitCode=0 Jan 28 07:30:25 crc kubenswrapper[4776]: I0128 07:30:25.097218 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" 
event={"ID":"5e450505-d924-4be0-8491-92297f012e24","Type":"ContainerDied","Data":"345797b7485f49f5bf274f18a2f9230258b9773ca85b09c37ec68c3689a6de5b"} Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.630405 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.748768 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle\") pod \"5e450505-d924-4be0-8491-92297f012e24\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.748826 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory\") pod \"5e450505-d924-4be0-8491-92297f012e24\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.748942 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam\") pod \"5e450505-d924-4be0-8491-92297f012e24\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.749104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0\") pod \"5e450505-d924-4be0-8491-92297f012e24\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.749163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djnkk\" (UniqueName: 
\"kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk\") pod \"5e450505-d924-4be0-8491-92297f012e24\" (UID: \"5e450505-d924-4be0-8491-92297f012e24\") " Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.755518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5e450505-d924-4be0-8491-92297f012e24" (UID: "5e450505-d924-4be0-8491-92297f012e24"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.757160 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk" (OuterVolumeSpecName: "kube-api-access-djnkk") pod "5e450505-d924-4be0-8491-92297f012e24" (UID: "5e450505-d924-4be0-8491-92297f012e24"). InnerVolumeSpecName "kube-api-access-djnkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.779955 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5e450505-d924-4be0-8491-92297f012e24" (UID: "5e450505-d924-4be0-8491-92297f012e24"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.785901 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e450505-d924-4be0-8491-92297f012e24" (UID: "5e450505-d924-4be0-8491-92297f012e24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.788669 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory" (OuterVolumeSpecName: "inventory") pod "5e450505-d924-4be0-8491-92297f012e24" (UID: "5e450505-d924-4be0-8491-92297f012e24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.851759 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.851791 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.851802 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djnkk\" (UniqueName: \"kubernetes.io/projected/5e450505-d924-4be0-8491-92297f012e24-kube-api-access-djnkk\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.851810 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:26 crc kubenswrapper[4776]: I0128 07:30:26.851819 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e450505-d924-4be0-8491-92297f012e24-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.131715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" event={"ID":"5e450505-d924-4be0-8491-92297f012e24","Type":"ContainerDied","Data":"d055b561edd05b71d34ab3946f7d8e39847c32ac29ecb699d9fdb0ffa23f459d"} Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.131754 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d055b561edd05b71d34ab3946f7d8e39847c32ac29ecb699d9fdb0ffa23f459d" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.131788 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-288pn" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.223091 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6"] Jan 28 07:30:27 crc kubenswrapper[4776]: E0128 07:30:27.223586 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e450505-d924-4be0-8491-92297f012e24" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.223607 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e450505-d924-4be0-8491-92297f012e24" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 07:30:27 crc kubenswrapper[4776]: E0128 07:30:27.223619 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c50c04-4144-4b30-899a-c0ed5e61eb11" containerName="collect-profiles" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.223627 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c50c04-4144-4b30-899a-c0ed5e61eb11" containerName="collect-profiles" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.223851 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c50c04-4144-4b30-899a-c0ed5e61eb11" containerName="collect-profiles" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.223875 4776 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5e450505-d924-4be0-8491-92297f012e24" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.224675 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.227110 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.227129 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.227323 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.227677 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.227889 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.228015 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.228292 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.244371 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6"] Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369487 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369737 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369778 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369837 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") 
" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369855 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qtw\" (UniqueName: \"kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369887 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.369944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.370014 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471306 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471376 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471418 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qtw\" (UniqueName: \"kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471455 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" 
(UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471573 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.471650 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.472072 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.476282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.475780 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.477072 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.477370 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: 
\"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.478049 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.478121 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.478886 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.498637 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qtw\" (UniqueName: \"kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gtkx6\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:27 crc kubenswrapper[4776]: I0128 07:30:27.554755 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:30:28 crc kubenswrapper[4776]: I0128 07:30:28.189203 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6"] Jan 28 07:30:29 crc kubenswrapper[4776]: I0128 07:30:29.153118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" event={"ID":"5f807cd7-856d-4fd5-afe4-963a4a77a5bf","Type":"ContainerStarted","Data":"75100459393aabefe03732cb6c38d503a6a5273e8470f760f17e22564f1a98f2"} Jan 28 07:30:29 crc kubenswrapper[4776]: I0128 07:30:29.153388 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" event={"ID":"5f807cd7-856d-4fd5-afe4-963a4a77a5bf","Type":"ContainerStarted","Data":"cd5c82611b85dc99c2128c3b2c67f98d12a3179eee90ca8435f915ca4808e2e9"} Jan 28 07:30:29 crc kubenswrapper[4776]: I0128 07:30:29.197924 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" podStartSLOduration=1.7365511310000001 podStartE2EDuration="2.19789987s" podCreationTimestamp="2026-01-28 07:30:27 +0000 UTC" firstStartedPulling="2026-01-28 07:30:28.188572846 +0000 UTC m=+2399.604233006" lastFinishedPulling="2026-01-28 07:30:28.649921575 +0000 UTC m=+2400.065581745" observedRunningTime="2026-01-28 07:30:29.189411541 +0000 UTC m=+2400.605071721" watchObservedRunningTime="2026-01-28 07:30:29.19789987 +0000 UTC m=+2400.613560050" Jan 28 07:30:33 crc kubenswrapper[4776]: I0128 07:30:33.851974 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:30:33 crc kubenswrapper[4776]: I0128 
07:30:33.852670 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:30:33 crc kubenswrapper[4776]: I0128 07:30:33.852736 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:30:33 crc kubenswrapper[4776]: I0128 07:30:33.853937 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:30:33 crc kubenswrapper[4776]: I0128 07:30:33.854042 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" gracePeriod=600 Jan 28 07:30:34 crc kubenswrapper[4776]: I0128 07:30:34.202846 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" exitCode=0 Jan 28 07:30:34 crc kubenswrapper[4776]: I0128 07:30:34.202938 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"} Jan 28 07:30:34 crc 
kubenswrapper[4776]: I0128 07:30:34.203276 4776 scope.go:117] "RemoveContainer" containerID="de7aa249c85a2b16467bbe0be5a794768c1a7b699134487f0a7e58ea7b6360b4" Jan 28 07:30:34 crc kubenswrapper[4776]: E0128 07:30:34.486423 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:30:35 crc kubenswrapper[4776]: I0128 07:30:35.215769 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:30:35 crc kubenswrapper[4776]: E0128 07:30:35.216299 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:30:47 crc kubenswrapper[4776]: I0128 07:30:47.304671 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:30:47 crc kubenswrapper[4776]: E0128 07:30:47.305578 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 
28 07:30:48 crc kubenswrapper[4776]: I0128 07:30:48.933516 4776 scope.go:117] "RemoveContainer" containerID="cdfd9b87a3a8ef32183db88d792a1c130b5e9456aaec004ffebd8ff4ddbc0595" Jan 28 07:30:58 crc kubenswrapper[4776]: I0128 07:30:58.305429 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:30:58 crc kubenswrapper[4776]: E0128 07:30:58.306151 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:31:10 crc kubenswrapper[4776]: I0128 07:31:10.304454 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:31:10 crc kubenswrapper[4776]: E0128 07:31:10.305320 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.641908 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.645704 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.668080 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.718004 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqphf\" (UniqueName: \"kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.718058 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.718120 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.819166 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.819335 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gqphf\" (UniqueName: \"kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.819370 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.819720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.819900 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.840229 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqphf\" (UniqueName: \"kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf\") pod \"redhat-marketplace-l9ttl\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:20 crc kubenswrapper[4776]: I0128 07:31:20.984526 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:21 crc kubenswrapper[4776]: I0128 07:31:21.473845 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:21 crc kubenswrapper[4776]: I0128 07:31:21.700746 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerStarted","Data":"7bbb037ed3101850a004abb7d81efd8b8c9e3fd24c35f990479a39fa0eec4689"} Jan 28 07:31:22 crc kubenswrapper[4776]: I0128 07:31:22.304971 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:31:22 crc kubenswrapper[4776]: E0128 07:31:22.305594 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:31:22 crc kubenswrapper[4776]: I0128 07:31:22.714114 4776 generic.go:334] "Generic (PLEG): container finished" podID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerID="0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979" exitCode=0 Jan 28 07:31:22 crc kubenswrapper[4776]: I0128 07:31:22.714195 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerDied","Data":"0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979"} Jan 28 07:31:24 crc kubenswrapper[4776]: I0128 07:31:24.738726 4776 generic.go:334] "Generic (PLEG): container finished" podID="7b881629-cd08-476e-a657-bc5eb8b3757c" 
containerID="4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb" exitCode=0 Jan 28 07:31:24 crc kubenswrapper[4776]: I0128 07:31:24.738818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerDied","Data":"4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb"} Jan 28 07:31:25 crc kubenswrapper[4776]: I0128 07:31:25.751099 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerStarted","Data":"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d"} Jan 28 07:31:25 crc kubenswrapper[4776]: I0128 07:31:25.792859 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l9ttl" podStartSLOduration=3.229081792 podStartE2EDuration="5.792838115s" podCreationTimestamp="2026-01-28 07:31:20 +0000 UTC" firstStartedPulling="2026-01-28 07:31:22.716279556 +0000 UTC m=+2454.131939716" lastFinishedPulling="2026-01-28 07:31:25.280035869 +0000 UTC m=+2456.695696039" observedRunningTime="2026-01-28 07:31:25.7829799 +0000 UTC m=+2457.198640060" watchObservedRunningTime="2026-01-28 07:31:25.792838115 +0000 UTC m=+2457.208498275" Jan 28 07:31:30 crc kubenswrapper[4776]: I0128 07:31:30.984958 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:30 crc kubenswrapper[4776]: I0128 07:31:30.985602 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:31 crc kubenswrapper[4776]: I0128 07:31:31.044646 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:31 crc kubenswrapper[4776]: I0128 07:31:31.931351 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:31 crc kubenswrapper[4776]: I0128 07:31:31.993434 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:33 crc kubenswrapper[4776]: I0128 07:31:33.850955 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l9ttl" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="registry-server" containerID="cri-o://ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d" gracePeriod=2 Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.308463 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.424027 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities\") pod \"7b881629-cd08-476e-a657-bc5eb8b3757c\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.424900 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content\") pod \"7b881629-cd08-476e-a657-bc5eb8b3757c\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.424942 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities" (OuterVolumeSpecName: "utilities") pod "7b881629-cd08-476e-a657-bc5eb8b3757c" (UID: "7b881629-cd08-476e-a657-bc5eb8b3757c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.425043 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqphf\" (UniqueName: \"kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf\") pod \"7b881629-cd08-476e-a657-bc5eb8b3757c\" (UID: \"7b881629-cd08-476e-a657-bc5eb8b3757c\") " Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.426322 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.436864 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf" (OuterVolumeSpecName: "kube-api-access-gqphf") pod "7b881629-cd08-476e-a657-bc5eb8b3757c" (UID: "7b881629-cd08-476e-a657-bc5eb8b3757c"). InnerVolumeSpecName "kube-api-access-gqphf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.446735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b881629-cd08-476e-a657-bc5eb8b3757c" (UID: "7b881629-cd08-476e-a657-bc5eb8b3757c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.529661 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b881629-cd08-476e-a657-bc5eb8b3757c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.529731 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqphf\" (UniqueName: \"kubernetes.io/projected/7b881629-cd08-476e-a657-bc5eb8b3757c-kube-api-access-gqphf\") on node \"crc\" DevicePath \"\"" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.870915 4776 generic.go:334] "Generic (PLEG): container finished" podID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerID="ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d" exitCode=0 Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.870984 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerDied","Data":"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d"} Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.871025 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9ttl" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.871059 4776 scope.go:117] "RemoveContainer" containerID="ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.871039 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9ttl" event={"ID":"7b881629-cd08-476e-a657-bc5eb8b3757c","Type":"ContainerDied","Data":"7bbb037ed3101850a004abb7d81efd8b8c9e3fd24c35f990479a39fa0eec4689"} Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.905860 4776 scope.go:117] "RemoveContainer" containerID="4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb" Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.933795 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.946571 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9ttl"] Jan 28 07:31:34 crc kubenswrapper[4776]: I0128 07:31:34.947966 4776 scope.go:117] "RemoveContainer" containerID="0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.003872 4776 scope.go:117] "RemoveContainer" containerID="ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d" Jan 28 07:31:35 crc kubenswrapper[4776]: E0128 07:31:35.004521 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d\": container with ID starting with ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d not found: ID does not exist" containerID="ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.004584 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d"} err="failed to get container status \"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d\": rpc error: code = NotFound desc = could not find container \"ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d\": container with ID starting with ca121b5207ec24bc1d6448ba2abe09aaa0ac351d03e420543cf13ec053231f7d not found: ID does not exist" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.004615 4776 scope.go:117] "RemoveContainer" containerID="4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb" Jan 28 07:31:35 crc kubenswrapper[4776]: E0128 07:31:35.005111 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb\": container with ID starting with 4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb not found: ID does not exist" containerID="4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.005176 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb"} err="failed to get container status \"4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb\": rpc error: code = NotFound desc = could not find container \"4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb\": container with ID starting with 4583d6d5fa8bee1cc6b92a4f4996e470333c681f7a02cf09431d6fc091da2bdb not found: ID does not exist" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.005216 4776 scope.go:117] "RemoveContainer" containerID="0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979" Jan 28 07:31:35 crc kubenswrapper[4776]: E0128 
07:31:35.005708 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979\": container with ID starting with 0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979 not found: ID does not exist" containerID="0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.005757 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979"} err="failed to get container status \"0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979\": rpc error: code = NotFound desc = could not find container \"0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979\": container with ID starting with 0c8910d3596cc255e459d77d0e00414f7137ba1d39bbc7510815b2599e329979 not found: ID does not exist" Jan 28 07:31:35 crc kubenswrapper[4776]: I0128 07:31:35.327277 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" path="/var/lib/kubelet/pods/7b881629-cd08-476e-a657-bc5eb8b3757c/volumes" Jan 28 07:31:37 crc kubenswrapper[4776]: I0128 07:31:37.305606 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:31:37 crc kubenswrapper[4776]: E0128 07:31:37.306525 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:31:51 crc kubenswrapper[4776]: I0128 07:31:51.305945 
4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:31:51 crc kubenswrapper[4776]: E0128 07:31:51.307116 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:32:03 crc kubenswrapper[4776]: I0128 07:32:03.307107 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:32:03 crc kubenswrapper[4776]: E0128 07:32:03.308131 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:32:16 crc kubenswrapper[4776]: I0128 07:32:16.305905 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:32:16 crc kubenswrapper[4776]: E0128 07:32:16.307140 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:32:30 crc kubenswrapper[4776]: I0128 
07:32:30.305970 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:32:30 crc kubenswrapper[4776]: E0128 07:32:30.307886 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:32:42 crc kubenswrapper[4776]: I0128 07:32:42.305296 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:32:42 crc kubenswrapper[4776]: E0128 07:32:42.306436 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:32:57 crc kubenswrapper[4776]: I0128 07:32:57.304927 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:32:57 crc kubenswrapper[4776]: E0128 07:32:57.306065 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:33:08 crc 
kubenswrapper[4776]: I0128 07:33:08.304655 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:33:08 crc kubenswrapper[4776]: E0128 07:33:08.305651 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:33:19 crc kubenswrapper[4776]: I0128 07:33:19.313797 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:33:19 crc kubenswrapper[4776]: E0128 07:33:19.315173 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:33:19 crc kubenswrapper[4776]: I0128 07:33:19.998070 4776 generic.go:334] "Generic (PLEG): container finished" podID="5f807cd7-856d-4fd5-afe4-963a4a77a5bf" containerID="75100459393aabefe03732cb6c38d503a6a5273e8470f760f17e22564f1a98f2" exitCode=0 Jan 28 07:33:19 crc kubenswrapper[4776]: I0128 07:33:19.998110 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" event={"ID":"5f807cd7-856d-4fd5-afe4-963a4a77a5bf","Type":"ContainerDied","Data":"75100459393aabefe03732cb6c38d503a6a5273e8470f760f17e22564f1a98f2"} Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.485671 4776 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636302 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636341 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636378 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636486 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636593 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2qtw\" (UniqueName: \"kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc 
kubenswrapper[4776]: I0128 07:33:21.636630 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636673 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636742 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.636806 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1\") pod \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\" (UID: \"5f807cd7-856d-4fd5-afe4-963a4a77a5bf\") " Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.642074 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw" (OuterVolumeSpecName: "kube-api-access-v2qtw") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "kube-api-access-v2qtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.644331 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.669399 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.670096 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.672041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.673199 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.683916 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.690103 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory" (OuterVolumeSpecName: "inventory") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.701358 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5f807cd7-856d-4fd5-afe4-963a4a77a5bf" (UID: "5f807cd7-856d-4fd5-afe4-963a4a77a5bf"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739154 4776 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739190 4776 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739201 4776 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739210 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739220 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739229 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2qtw\" (UniqueName: \"kubernetes.io/projected/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-kube-api-access-v2qtw\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739240 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739250 4776 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:21 crc kubenswrapper[4776]: I0128 07:33:21.739259 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f807cd7-856d-4fd5-afe4-963a4a77a5bf-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.021176 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" event={"ID":"5f807cd7-856d-4fd5-afe4-963a4a77a5bf","Type":"ContainerDied","Data":"cd5c82611b85dc99c2128c3b2c67f98d12a3179eee90ca8435f915ca4808e2e9"} Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.021250 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd5c82611b85dc99c2128c3b2c67f98d12a3179eee90ca8435f915ca4808e2e9" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.021280 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gtkx6" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.135497 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m"] Jan 28 07:33:22 crc kubenswrapper[4776]: E0128 07:33:22.135892 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="extract-utilities" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.135912 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="extract-utilities" Jan 28 07:33:22 crc kubenswrapper[4776]: E0128 07:33:22.135932 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="registry-server" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.135939 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="registry-server" Jan 28 07:33:22 crc kubenswrapper[4776]: E0128 07:33:22.135962 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f807cd7-856d-4fd5-afe4-963a4a77a5bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.135969 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f807cd7-856d-4fd5-afe4-963a4a77a5bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 07:33:22 crc kubenswrapper[4776]: E0128 07:33:22.135985 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="extract-content" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.135991 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="extract-content" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.136152 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5f807cd7-856d-4fd5-afe4-963a4a77a5bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.136179 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b881629-cd08-476e-a657-bc5eb8b3757c" containerName="registry-server" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.137208 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.139359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.139776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.139838 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.143890 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.144053 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cl6qn" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.154468 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m"] Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.252677 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.252741 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.252807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.252978 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.253036 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: 
\"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.253208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.253422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzpq\" (UniqueName: \"kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.356529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzpq\" (UniqueName: \"kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.358045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 
07:33:22.358295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.358527 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.358852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.359112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.359443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.361798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.363373 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.363686 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.364636 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: 
\"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.364671 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.368196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.379621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzpq\" (UniqueName: \"kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fp67m\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:22 crc kubenswrapper[4776]: I0128 07:33:22.453921 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" Jan 28 07:33:23 crc kubenswrapper[4776]: I0128 07:33:23.010090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m"] Jan 28 07:33:23 crc kubenswrapper[4776]: I0128 07:33:23.010154 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:33:23 crc kubenswrapper[4776]: I0128 07:33:23.039382 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" event={"ID":"1dde4f71-00f6-46fa-b16c-429edb9ee1ce","Type":"ContainerStarted","Data":"0795c4d2206803c0e86ddf4ef45051b83b196e14096491828f19198f008d08ad"} Jan 28 07:33:24 crc kubenswrapper[4776]: I0128 07:33:24.051715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" event={"ID":"1dde4f71-00f6-46fa-b16c-429edb9ee1ce","Type":"ContainerStarted","Data":"b9620620d9d5cbe390d4c0c45dfce60343bc840d6c944462dca7ee560162b669"} Jan 28 07:33:24 crc kubenswrapper[4776]: I0128 07:33:24.086914 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" podStartSLOduration=1.5938761989999999 podStartE2EDuration="2.086881163s" podCreationTimestamp="2026-01-28 07:33:22 +0000 UTC" firstStartedPulling="2026-01-28 07:33:23.00981661 +0000 UTC m=+2574.425476780" lastFinishedPulling="2026-01-28 07:33:23.502821544 +0000 UTC m=+2574.918481744" observedRunningTime="2026-01-28 07:33:24.077823089 +0000 UTC m=+2575.493483289" watchObservedRunningTime="2026-01-28 07:33:24.086881163 +0000 UTC m=+2575.502541373" Jan 28 07:33:33 crc kubenswrapper[4776]: I0128 07:33:33.304501 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:33:33 crc kubenswrapper[4776]: 
E0128 07:33:33.305423 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:33:48 crc kubenswrapper[4776]: I0128 07:33:48.304976 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:33:48 crc kubenswrapper[4776]: E0128 07:33:48.305687 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:34:02 crc kubenswrapper[4776]: I0128 07:34:02.305441 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:34:02 crc kubenswrapper[4776]: E0128 07:34:02.306419 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:34:17 crc kubenswrapper[4776]: I0128 07:34:17.305186 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:34:17 crc 
kubenswrapper[4776]: E0128 07:34:17.305940 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:34:29 crc kubenswrapper[4776]: I0128 07:34:29.305046 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:34:29 crc kubenswrapper[4776]: E0128 07:34:29.306430 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:34:41 crc kubenswrapper[4776]: I0128 07:34:41.304819 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:34:41 crc kubenswrapper[4776]: E0128 07:34:41.312520 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.387980 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgkpj"] Jan 28 
07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.398231 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.410901 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgkpj"]
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.479641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.479698 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bfz\" (UniqueName: \"kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.479720 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.582048 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.582120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bfz\" (UniqueName: \"kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.582148 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.582529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.582672 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.602646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bfz\" (UniqueName: \"kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz\") pod \"community-operators-xgkpj\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") " pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:47 crc kubenswrapper[4776]: I0128 07:34:47.732621 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:48 crc kubenswrapper[4776]: I0128 07:34:48.217022 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgkpj"]
Jan 28 07:34:48 crc kubenswrapper[4776]: I0128 07:34:48.917215 4776 generic.go:334] "Generic (PLEG): container finished" podID="2896ed2f-0fb3-419e-9338-247138421520" containerID="8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222" exitCode=0
Jan 28 07:34:48 crc kubenswrapper[4776]: I0128 07:34:48.917301 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerDied","Data":"8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222"}
Jan 28 07:34:48 crc kubenswrapper[4776]: I0128 07:34:48.917608 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerStarted","Data":"582a850ede99815b8ced4af998739a552c6ccfe6f8e281cb67403c5fdc2507ca"}
Jan 28 07:34:49 crc kubenswrapper[4776]: I0128 07:34:49.929961 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerStarted","Data":"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"}
Jan 28 07:34:50 crc kubenswrapper[4776]: I0128 07:34:50.943564 4776 generic.go:334] "Generic (PLEG): container finished" podID="2896ed2f-0fb3-419e-9338-247138421520" containerID="21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7" exitCode=0
Jan 28 07:34:50 crc kubenswrapper[4776]: I0128 07:34:50.943613 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerDied","Data":"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"}
Jan 28 07:34:51 crc kubenswrapper[4776]: I0128 07:34:51.955287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerStarted","Data":"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"}
Jan 28 07:34:51 crc kubenswrapper[4776]: I0128 07:34:51.984131 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgkpj" podStartSLOduration=2.452304507 podStartE2EDuration="4.984111198s" podCreationTimestamp="2026-01-28 07:34:47 +0000 UTC" firstStartedPulling="2026-01-28 07:34:48.920423726 +0000 UTC m=+2660.336083916" lastFinishedPulling="2026-01-28 07:34:51.452230447 +0000 UTC m=+2662.867890607" observedRunningTime="2026-01-28 07:34:51.977137641 +0000 UTC m=+2663.392797801" watchObservedRunningTime="2026-01-28 07:34:51.984111198 +0000 UTC m=+2663.399771358"
Jan 28 07:34:54 crc kubenswrapper[4776]: I0128 07:34:54.304978 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"
Jan 28 07:34:54 crc kubenswrapper[4776]: E0128 07:34:54.305490 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:34:57 crc kubenswrapper[4776]: I0128 07:34:57.732945 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:57 crc kubenswrapper[4776]: I0128 07:34:57.733623 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:57 crc kubenswrapper[4776]: I0128 07:34:57.818362 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:58 crc kubenswrapper[4776]: I0128 07:34:58.098309 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:34:58 crc kubenswrapper[4776]: I0128 07:34:58.161905 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgkpj"]
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.062976 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgkpj" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="registry-server" containerID="cri-o://941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313" gracePeriod=2
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.592088 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.654965 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities\") pod \"2896ed2f-0fb3-419e-9338-247138421520\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") "
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.655205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content\") pod \"2896ed2f-0fb3-419e-9338-247138421520\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") "
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.655251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9bfz\" (UniqueName: \"kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz\") pod \"2896ed2f-0fb3-419e-9338-247138421520\" (UID: \"2896ed2f-0fb3-419e-9338-247138421520\") "
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.656095 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities" (OuterVolumeSpecName: "utilities") pod "2896ed2f-0fb3-419e-9338-247138421520" (UID: "2896ed2f-0fb3-419e-9338-247138421520"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.660841 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz" (OuterVolumeSpecName: "kube-api-access-g9bfz") pod "2896ed2f-0fb3-419e-9338-247138421520" (UID: "2896ed2f-0fb3-419e-9338-247138421520"). InnerVolumeSpecName "kube-api-access-g9bfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.758664 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9bfz\" (UniqueName: \"kubernetes.io/projected/2896ed2f-0fb3-419e-9338-247138421520-kube-api-access-g9bfz\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:00 crc kubenswrapper[4776]: I0128 07:35:00.758715 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.074219 4776 generic.go:334] "Generic (PLEG): container finished" podID="2896ed2f-0fb3-419e-9338-247138421520" containerID="941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313" exitCode=0
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.074272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerDied","Data":"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"}
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.074275 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkpj"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.074307 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkpj" event={"ID":"2896ed2f-0fb3-419e-9338-247138421520","Type":"ContainerDied","Data":"582a850ede99815b8ced4af998739a552c6ccfe6f8e281cb67403c5fdc2507ca"}
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.074329 4776 scope.go:117] "RemoveContainer" containerID="941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.098153 4776 scope.go:117] "RemoveContainer" containerID="21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.122659 4776 scope.go:117] "RemoveContainer" containerID="8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.173256 4776 scope.go:117] "RemoveContainer" containerID="941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"
Jan 28 07:35:01 crc kubenswrapper[4776]: E0128 07:35:01.173831 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313\": container with ID starting with 941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313 not found: ID does not exist" containerID="941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.173888 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313"} err="failed to get container status \"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313\": rpc error: code = NotFound desc = could not find container \"941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313\": container with ID starting with 941313d073f89167b3d8861e6897efa95e1c76154f7a8b6b0409b4dff8022313 not found: ID does not exist"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.173917 4776 scope.go:117] "RemoveContainer" containerID="21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"
Jan 28 07:35:01 crc kubenswrapper[4776]: E0128 07:35:01.174381 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7\": container with ID starting with 21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7 not found: ID does not exist" containerID="21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.174445 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7"} err="failed to get container status \"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7\": rpc error: code = NotFound desc = could not find container \"21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7\": container with ID starting with 21c2c92854ff801a5127d163d51b28bac3b1f94daff02fcb1e60619f1edad8f7 not found: ID does not exist"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.174497 4776 scope.go:117] "RemoveContainer" containerID="8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222"
Jan 28 07:35:01 crc kubenswrapper[4776]: E0128 07:35:01.174918 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222\": container with ID starting with 8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222 not found: ID does not exist" containerID="8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222"
Jan 28 07:35:01 crc kubenswrapper[4776]: I0128 07:35:01.174948 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222"} err="failed to get container status \"8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222\": rpc error: code = NotFound desc = could not find container \"8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222\": container with ID starting with 8cbb4ce304695d965224a1a5e3329eedfaade58a43f84bbf9ea357967dfe4222 not found: ID does not exist"
Jan 28 07:35:02 crc kubenswrapper[4776]: I0128 07:35:02.104636 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2896ed2f-0fb3-419e-9338-247138421520" (UID: "2896ed2f-0fb3-419e-9338-247138421520"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:35:02 crc kubenswrapper[4776]: I0128 07:35:02.186923 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2896ed2f-0fb3-419e-9338-247138421520-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:02 crc kubenswrapper[4776]: I0128 07:35:02.343135 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgkpj"]
Jan 28 07:35:02 crc kubenswrapper[4776]: I0128 07:35:02.350834 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgkpj"]
Jan 28 07:35:03 crc kubenswrapper[4776]: I0128 07:35:03.325162 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2896ed2f-0fb3-419e-9338-247138421520" path="/var/lib/kubelet/pods/2896ed2f-0fb3-419e-9338-247138421520/volumes"
Jan 28 07:35:07 crc kubenswrapper[4776]: I0128 07:35:07.305685 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"
Jan 28 07:35:07 crc kubenswrapper[4776]: E0128 07:35:07.306170 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:35:19 crc kubenswrapper[4776]: I0128 07:35:19.311776 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"
Jan 28 07:35:19 crc kubenswrapper[4776]: E0128 07:35:19.312585 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:35:32 crc kubenswrapper[4776]: I0128 07:35:32.309494 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"
Jan 28 07:35:32 crc kubenswrapper[4776]: E0128 07:35:32.310443 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:35:44 crc kubenswrapper[4776]: I0128 07:35:44.501188 4776 generic.go:334] "Generic (PLEG): container finished" podID="1dde4f71-00f6-46fa-b16c-429edb9ee1ce" containerID="b9620620d9d5cbe390d4c0c45dfce60343bc840d6c944462dca7ee560162b669" exitCode=0
Jan 28 07:35:44 crc kubenswrapper[4776]: I0128 07:35:44.501331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" event={"ID":"1dde4f71-00f6-46fa-b16c-429edb9ee1ce","Type":"ContainerDied","Data":"b9620620d9d5cbe390d4c0c45dfce60343bc840d6c944462dca7ee560162b669"}
Jan 28 07:35:45 crc kubenswrapper[4776]: I0128 07:35:45.992108 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m"
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160072 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160405 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160621 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160645 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzzpq\" (UniqueName: \"kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160662 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.160756 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle\") pod \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\" (UID: \"1dde4f71-00f6-46fa-b16c-429edb9ee1ce\") "
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.165882 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.167466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq" (OuterVolumeSpecName: "kube-api-access-nzzpq") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "kube-api-access-nzzpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.191610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.191947 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory" (OuterVolumeSpecName: "inventory") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.202038 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.203446 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.213373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1dde4f71-00f6-46fa-b16c-429edb9ee1ce" (UID: "1dde4f71-00f6-46fa-b16c-429edb9ee1ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263808 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263861 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263884 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzzpq\" (UniqueName: \"kubernetes.io/projected/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-kube-api-access-nzzpq\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263902 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263920 4776 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263939 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.263956 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1dde4f71-00f6-46fa-b16c-429edb9ee1ce-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.527523 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m" event={"ID":"1dde4f71-00f6-46fa-b16c-429edb9ee1ce","Type":"ContainerDied","Data":"0795c4d2206803c0e86ddf4ef45051b83b196e14096491828f19198f008d08ad"}
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.527862 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0795c4d2206803c0e86ddf4ef45051b83b196e14096491828f19198f008d08ad"
Jan 28 07:35:46 crc kubenswrapper[4776]: I0128 07:35:46.527599 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fp67m"
Jan 28 07:35:47 crc kubenswrapper[4776]: I0128 07:35:47.308607 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e"
Jan 28 07:35:48 crc kubenswrapper[4776]: I0128 07:35:48.552503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a"}
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.633341 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.634235 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="prometheus" containerID="cri-o://31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904" gracePeriod=600
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.634758 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="thanos-sidecar" containerID="cri-o://f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb" gracePeriod=600
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.634818 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="config-reloader" containerID="cri-o://f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1" gracePeriod=600
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.942133 4776 generic.go:334] "Generic (PLEG): container finished" podID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerID="f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb" exitCode=0
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.942411 4776 generic.go:334] "Generic (PLEG): container finished" podID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerID="31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904" exitCode=0
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.942215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerDied","Data":"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb"}
Jan 28 07:36:24 crc kubenswrapper[4776]: I0128 07:36:24.942462 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerDied","Data":"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904"}
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.670680 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.810847 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811521 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811655 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811735 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811793 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811904 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.811961 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9t4z\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812135 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812187 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812239 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812278 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812354 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"da477959-63db-4b5e-aef0-ca65915e6c3a\" (UID: \"da477959-63db-4b5e-aef0-ca65915e6c3a\") "
Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.812917 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.813134 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.814208 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.814246 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.822913 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.824191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.824247 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.824438 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z" (OuterVolumeSpecName: "kube-api-access-f9t4z") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "kube-api-access-f9t4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.824997 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.825620 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.826229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out" (OuterVolumeSpecName: "config-out") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.828640 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config" (OuterVolumeSpecName: "config") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.840570 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "pvc-28337057-3ad3-471e-9736-ebdaa343fbf9". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.903110 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config" (OuterVolumeSpecName: "web-config") pod "da477959-63db-4b5e-aef0-ca65915e6c3a" (UID: "da477959-63db-4b5e-aef0-ca65915e6c3a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915430 4776 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915469 4776 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915483 4776 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915499 4776 
reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915515 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9t4z\" (UniqueName: \"kubernetes.io/projected/da477959-63db-4b5e-aef0-ca65915e6c3a-kube-api-access-f9t4z\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915529 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915539 4776 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da477959-63db-4b5e-aef0-ca65915e6c3a-config-out\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915574 4776 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915585 4776 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/da477959-63db-4b5e-aef0-ca65915e6c3a-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915595 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-config\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915607 4776 reconciler_common.go:293] "Volume detached for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/da477959-63db-4b5e-aef0-ca65915e6c3a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.915653 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") on node \"crc\" " Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.940589 4776 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.940747 4776 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-28337057-3ad3-471e-9736-ebdaa343fbf9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9") on node "crc" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.954959 4776 generic.go:334] "Generic (PLEG): container finished" podID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerID="f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1" exitCode=0 Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.955011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerDied","Data":"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1"} Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.955043 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"da477959-63db-4b5e-aef0-ca65915e6c3a","Type":"ContainerDied","Data":"30b3ba66e268e8279a8c8d06018dba5266fcb1cc572cef6631f11b05c219dd62"} Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.955061 4776 
scope.go:117] "RemoveContainer" containerID="f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb" Jan 28 07:36:25 crc kubenswrapper[4776]: I0128 07:36:25.955411 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.007207 4776 scope.go:117] "RemoveContainer" containerID="f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.008375 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.017827 4776 reconciler_common.go:293] "Volume detached for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") on node \"crc\" DevicePath \"\"" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.019110 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.050560 4776 scope.go:117] "RemoveContainer" containerID="31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070136 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070837 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="init-config-reloader" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070855 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="init-config-reloader" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070874 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="extract-utilities" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070881 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="extract-utilities" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070903 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="registry-server" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070911 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="registry-server" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070930 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="config-reloader" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070935 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="config-reloader" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070949 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dde4f71-00f6-46fa-b16c-429edb9ee1ce" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070956 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dde4f71-00f6-46fa-b16c-429edb9ee1ce" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070978 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="thanos-sidecar" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.070984 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="thanos-sidecar" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.070998 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="extract-content" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071005 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="extract-content" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.071027 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="prometheus" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071034 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="prometheus" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071516 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="config-reloader" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071561 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="prometheus" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071580 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dde4f71-00f6-46fa-b16c-429edb9ee1ce" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071608 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2896ed2f-0fb3-419e-9338-247138421520" containerName="registry-server" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.071628 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" containerName="thanos-sidecar" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.076173 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.083346 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.083932 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.084590 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.084634 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.084752 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.084964 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.089892 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.090755 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lmp7n" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.093973 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.095895 4776 scope.go:117] "RemoveContainer" containerID="1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.120823 4776 scope.go:117] "RemoveContainer" 
containerID="f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.121257 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb\": container with ID starting with f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb not found: ID does not exist" containerID="f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.121301 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb"} err="failed to get container status \"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb\": rpc error: code = NotFound desc = could not find container \"f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb\": container with ID starting with f1f1284aa863882b26daa17086dc9c3ff606f662f50e3aeec890ee305e734ccb not found: ID does not exist" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.121327 4776 scope.go:117] "RemoveContainer" containerID="f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.121794 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1\": container with ID starting with f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1 not found: ID does not exist" containerID="f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.121835 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1"} err="failed to get container status \"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1\": rpc error: code = NotFound desc = could not find container \"f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1\": container with ID starting with f6de7166c91f12ad2d592b26f3bf9d11436b545a336d99ef306cb0a69d3704d1 not found: ID does not exist" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.121856 4776 scope.go:117] "RemoveContainer" containerID="31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.122093 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904\": container with ID starting with 31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904 not found: ID does not exist" containerID="31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.122110 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904"} err="failed to get container status \"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904\": rpc error: code = NotFound desc = could not find container \"31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904\": container with ID starting with 31090420bce1d6f3da0ea75cf9ca0a58687b2f7a5dc24e0358130b8383f56904 not found: ID does not exist" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.122121 4776 scope.go:117] "RemoveContainer" containerID="1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00" Jan 28 07:36:26 crc kubenswrapper[4776]: E0128 07:36:26.122417 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00\": container with ID starting with 1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00 not found: ID does not exist" containerID="1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.122446 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00"} err="failed to get container status \"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00\": rpc error: code = NotFound desc = could not find container \"1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00\": container with ID starting with 1ae72341d18eb712fe275dd8bae7d20c86127dcbe2a92b3a0e87866d61572b00 not found: ID does not exist" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221787 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221839 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nvlx\" 
(UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-kube-api-access-4nvlx\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.221994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222021 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5a41440-d466-4d04-adb9-13760bb7977a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " 
pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222037 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222104 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222162 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.222187 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323829 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nvlx\" (UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-kube-api-access-4nvlx\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323912 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc 
kubenswrapper[4776]: I0128 07:36:26.323949 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5a41440-d466-4d04-adb9-13760bb7977a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.323998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.324044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.324066 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.324836 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.325122 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.325282 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e5a41440-d466-4d04-adb9-13760bb7977a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.326842 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.327073 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2e9c055391cfae11cfbe6abdf3b945738020df5dfe58cd3a482199e94820340b/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.328266 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5a41440-d466-4d04-adb9-13760bb7977a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.329018 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.329199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.329416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.329950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.331964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.338390 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5a41440-d466-4d04-adb9-13760bb7977a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.342772 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nvlx\" (UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-kube-api-access-4nvlx\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.361254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5a41440-d466-4d04-adb9-13760bb7977a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.367023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28337057-3ad3-471e-9736-ebdaa343fbf9\") pod \"prometheus-metric-storage-0\" (UID: \"e5a41440-d466-4d04-adb9-13760bb7977a\") " pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.454311 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.923283 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 07:36:26 crc kubenswrapper[4776]: W0128 07:36:26.941281 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a41440_d466_4d04_adb9_13760bb7977a.slice/crio-ecfe3677bda45b524daa4dc34ed81759dd1b50dabd1db986563500f6653cb20c WatchSource:0}: Error finding container ecfe3677bda45b524daa4dc34ed81759dd1b50dabd1db986563500f6653cb20c: Status 404 returned error can't find the container with id ecfe3677bda45b524daa4dc34ed81759dd1b50dabd1db986563500f6653cb20c Jan 28 07:36:26 crc kubenswrapper[4776]: I0128 07:36:26.965959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerStarted","Data":"ecfe3677bda45b524daa4dc34ed81759dd1b50dabd1db986563500f6653cb20c"} Jan 28 07:36:27 crc kubenswrapper[4776]: I0128 07:36:27.316560 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="da477959-63db-4b5e-aef0-ca65915e6c3a" path="/var/lib/kubelet/pods/da477959-63db-4b5e-aef0-ca65915e6c3a/volumes" Jan 28 07:36:32 crc kubenswrapper[4776]: I0128 07:36:32.075667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerStarted","Data":"0c75f77cfd45a471b2210dce89d004e1d172f366c8f71d96b4743c9cf7c72b99"} Jan 28 07:36:39 crc kubenswrapper[4776]: I0128 07:36:39.158307 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5a41440-d466-4d04-adb9-13760bb7977a" containerID="0c75f77cfd45a471b2210dce89d004e1d172f366c8f71d96b4743c9cf7c72b99" exitCode=0 Jan 28 07:36:39 crc kubenswrapper[4776]: I0128 07:36:39.159042 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerDied","Data":"0c75f77cfd45a471b2210dce89d004e1d172f366c8f71d96b4743c9cf7c72b99"} Jan 28 07:36:40 crc kubenswrapper[4776]: I0128 07:36:40.175120 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerStarted","Data":"0ff0b54a286702b1eae00fcc7c80c6977b6ebd46be6b246d3f15ade959a3736e"} Jan 28 07:36:44 crc kubenswrapper[4776]: I0128 07:36:44.214306 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerStarted","Data":"44450f19cea34e91807eae16f42d18b75380555efb3423b7e5f262a843be7000"} Jan 28 07:36:45 crc kubenswrapper[4776]: I0128 07:36:45.224942 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e5a41440-d466-4d04-adb9-13760bb7977a","Type":"ContainerStarted","Data":"40e5a6d1ffe12472b5d76db867bfdbfa31543fb564c3f0c446c16de619437cf9"} Jan 28 07:36:45 crc kubenswrapper[4776]: I0128 
07:36:45.260047 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.260027179 podStartE2EDuration="19.260027179s" podCreationTimestamp="2026-01-28 07:36:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 07:36:45.252868686 +0000 UTC m=+2776.668528866" watchObservedRunningTime="2026-01-28 07:36:45.260027179 +0000 UTC m=+2776.675687329" Jan 28 07:36:46 crc kubenswrapper[4776]: I0128 07:36:46.455226 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.338301 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.343164 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.365860 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.539030 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.539146 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") 
" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.539449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt859\" (UniqueName: \"kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.640975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt859\" (UniqueName: \"kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.641427 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.641489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.642032 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content\") pod \"certified-operators-hcx9j\" (UID: 
\"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.642657 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.666404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt859\" (UniqueName: \"kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859\") pod \"certified-operators-hcx9j\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:47 crc kubenswrapper[4776]: I0128 07:36:47.683249 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:48 crc kubenswrapper[4776]: I0128 07:36:48.021973 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:36:48 crc kubenswrapper[4776]: W0128 07:36:48.029562 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23852696_f715_4a5d_ada0_70d73e800891.slice/crio-128dc679a99e431dffa2f751817b9e121c7c00fa62d8c93cc12d0fe0dbe90175 WatchSource:0}: Error finding container 128dc679a99e431dffa2f751817b9e121c7c00fa62d8c93cc12d0fe0dbe90175: Status 404 returned error can't find the container with id 128dc679a99e431dffa2f751817b9e121c7c00fa62d8c93cc12d0fe0dbe90175 Jan 28 07:36:48 crc kubenswrapper[4776]: I0128 07:36:48.263204 4776 generic.go:334] "Generic (PLEG): container finished" podID="23852696-f715-4a5d-ada0-70d73e800891" 
containerID="7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955" exitCode=0 Jan 28 07:36:48 crc kubenswrapper[4776]: I0128 07:36:48.263243 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerDied","Data":"7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955"} Jan 28 07:36:48 crc kubenswrapper[4776]: I0128 07:36:48.263267 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerStarted","Data":"128dc679a99e431dffa2f751817b9e121c7c00fa62d8c93cc12d0fe0dbe90175"} Jan 28 07:36:50 crc kubenswrapper[4776]: I0128 07:36:50.289469 4776 generic.go:334] "Generic (PLEG): container finished" podID="23852696-f715-4a5d-ada0-70d73e800891" containerID="5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e" exitCode=0 Jan 28 07:36:50 crc kubenswrapper[4776]: I0128 07:36:50.290290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerDied","Data":"5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e"} Jan 28 07:36:51 crc kubenswrapper[4776]: I0128 07:36:51.301243 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerStarted","Data":"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318"} Jan 28 07:36:51 crc kubenswrapper[4776]: I0128 07:36:51.331376 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hcx9j" podStartSLOduration=1.783045248 podStartE2EDuration="4.331355563s" podCreationTimestamp="2026-01-28 07:36:47 +0000 UTC" firstStartedPulling="2026-01-28 07:36:48.264909106 
+0000 UTC m=+2779.680569266" lastFinishedPulling="2026-01-28 07:36:50.813219381 +0000 UTC m=+2782.228879581" observedRunningTime="2026-01-28 07:36:51.32790321 +0000 UTC m=+2782.743563380" watchObservedRunningTime="2026-01-28 07:36:51.331355563 +0000 UTC m=+2782.747015723" Jan 28 07:36:56 crc kubenswrapper[4776]: I0128 07:36:56.454755 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:56 crc kubenswrapper[4776]: I0128 07:36:56.461313 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:57 crc kubenswrapper[4776]: I0128 07:36:57.398576 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 07:36:57 crc kubenswrapper[4776]: I0128 07:36:57.683992 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:57 crc kubenswrapper[4776]: I0128 07:36:57.684043 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:57 crc kubenswrapper[4776]: I0128 07:36:57.728684 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:58 crc kubenswrapper[4776]: I0128 07:36:58.494210 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:36:58 crc kubenswrapper[4776]: I0128 07:36:58.572941 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:37:00 crc kubenswrapper[4776]: I0128 07:37:00.425628 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hcx9j" podUID="23852696-f715-4a5d-ada0-70d73e800891" 
containerName="registry-server" containerID="cri-o://6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318" gracePeriod=2 Jan 28 07:37:00 crc kubenswrapper[4776]: I0128 07:37:00.968312 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.037390 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content\") pod \"23852696-f715-4a5d-ada0-70d73e800891\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.037775 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt859\" (UniqueName: \"kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859\") pod \"23852696-f715-4a5d-ada0-70d73e800891\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.037908 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities\") pod \"23852696-f715-4a5d-ada0-70d73e800891\" (UID: \"23852696-f715-4a5d-ada0-70d73e800891\") " Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.039308 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities" (OuterVolumeSpecName: "utilities") pod "23852696-f715-4a5d-ada0-70d73e800891" (UID: "23852696-f715-4a5d-ada0-70d73e800891"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.051206 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859" (OuterVolumeSpecName: "kube-api-access-xt859") pod "23852696-f715-4a5d-ada0-70d73e800891" (UID: "23852696-f715-4a5d-ada0-70d73e800891"). InnerVolumeSpecName "kube-api-access-xt859". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.097909 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23852696-f715-4a5d-ada0-70d73e800891" (UID: "23852696-f715-4a5d-ada0-70d73e800891"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.140668 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt859\" (UniqueName: \"kubernetes.io/projected/23852696-f715-4a5d-ada0-70d73e800891-kube-api-access-xt859\") on node \"crc\" DevicePath \"\"" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.140707 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.140720 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23852696-f715-4a5d-ada0-70d73e800891-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.438682 4776 generic.go:334] "Generic (PLEG): container finished" podID="23852696-f715-4a5d-ada0-70d73e800891" 
containerID="6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318" exitCode=0 Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.438884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerDied","Data":"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318"} Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.439993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcx9j" event={"ID":"23852696-f715-4a5d-ada0-70d73e800891","Type":"ContainerDied","Data":"128dc679a99e431dffa2f751817b9e121c7c00fa62d8c93cc12d0fe0dbe90175"} Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.440099 4776 scope.go:117] "RemoveContainer" containerID="6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.439049 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hcx9j" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.475596 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.480044 4776 scope.go:117] "RemoveContainer" containerID="5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.486244 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hcx9j"] Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.517994 4776 scope.go:117] "RemoveContainer" containerID="7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.640453 4776 scope.go:117] "RemoveContainer" containerID="6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318" Jan 28 07:37:01 crc kubenswrapper[4776]: E0128 07:37:01.642428 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318\": container with ID starting with 6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318 not found: ID does not exist" containerID="6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.642501 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318"} err="failed to get container status \"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318\": rpc error: code = NotFound desc = could not find container \"6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318\": container with ID starting with 6795d9407fbc5edf25fc57f0846348df1bd8792dcd0f5d1889ad1face545b318 not 
found: ID does not exist" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.642572 4776 scope.go:117] "RemoveContainer" containerID="5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e" Jan 28 07:37:01 crc kubenswrapper[4776]: E0128 07:37:01.646220 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e\": container with ID starting with 5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e not found: ID does not exist" containerID="5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.646285 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e"} err="failed to get container status \"5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e\": rpc error: code = NotFound desc = could not find container \"5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e\": container with ID starting with 5839e865565508093390c039eb3c938e080e66dd9abadf89f17572a77759152e not found: ID does not exist" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.646319 4776 scope.go:117] "RemoveContainer" containerID="7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955" Jan 28 07:37:01 crc kubenswrapper[4776]: E0128 07:37:01.646905 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955\": container with ID starting with 7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955 not found: ID does not exist" containerID="7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955" Jan 28 07:37:01 crc kubenswrapper[4776]: I0128 07:37:01.646956 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955"} err="failed to get container status \"7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955\": rpc error: code = NotFound desc = could not find container \"7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955\": container with ID starting with 7f9751b1a66009d483581900ba232adacb9094c2c4e26098a882622f87dd5955 not found: ID does not exist" Jan 28 07:37:03 crc kubenswrapper[4776]: I0128 07:37:03.314536 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23852696-f715-4a5d-ada0-70d73e800891" path="/var/lib/kubelet/pods/23852696-f715-4a5d-ada0-70d73e800891/volumes" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.560846 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 28 07:37:20 crc kubenswrapper[4776]: E0128 07:37:20.561874 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="extract-utilities" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.561890 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="extract-utilities" Jan 28 07:37:20 crc kubenswrapper[4776]: E0128 07:37:20.561908 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="registry-server" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.561915 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="registry-server" Jan 28 07:37:20 crc kubenswrapper[4776]: E0128 07:37:20.561955 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="extract-content" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.561962 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="extract-content" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.562148 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="23852696-f715-4a5d-ada0-70d73e800891" containerName="registry-server" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.562908 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.565801 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-86mg2" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.565940 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.566350 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.566442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.591186 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkfj\" (UniqueName: \"kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760585 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760822 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.760961 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.761078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862533 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkfj\" (UniqueName: \"kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862752 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862897 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862925 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.862962 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.863017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.863845 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.863918 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.864402 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.864854 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") 
" pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.866596 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.871198 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.871388 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.873894 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.891457 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkfj\" (UniqueName: \"kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.892265 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " pod="openstack/tempest-tests-tempest" Jan 28 07:37:20 crc kubenswrapper[4776]: I0128 07:37:20.912948 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 28 07:37:21 crc kubenswrapper[4776]: I0128 07:37:21.444940 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 28 07:37:21 crc kubenswrapper[4776]: I0128 07:37:21.660753 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0605b294-d429-4bfd-8924-39f8cb5cb105","Type":"ContainerStarted","Data":"49258267d9dcabfa04b7c7676bb04e4b80c75105fca1f4bc9300342f767706bd"} Jan 28 07:37:32 crc kubenswrapper[4776]: I0128 07:37:32.567042 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 28 07:37:33 crc kubenswrapper[4776]: I0128 07:37:33.784249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0605b294-d429-4bfd-8924-39f8cb5cb105","Type":"ContainerStarted","Data":"7c732218fd60783b0babc90bbb805fb43a92393873f5febb8b1b0b5dbf501a07"} Jan 28 07:37:33 crc kubenswrapper[4776]: I0128 07:37:33.803561 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.687305495 podStartE2EDuration="14.803520836s" podCreationTimestamp="2026-01-28 07:37:19 +0000 UTC" firstStartedPulling="2026-01-28 07:37:21.447000189 +0000 UTC m=+2812.862660349" lastFinishedPulling="2026-01-28 07:37:32.56321552 +0000 UTC m=+2823.978875690" observedRunningTime="2026-01-28 07:37:33.798647334 +0000 UTC m=+2825.214307504" watchObservedRunningTime="2026-01-28 07:37:33.803520836 +0000 UTC m=+2825.219180996" Jan 28 07:37:49 crc 
kubenswrapper[4776]: I0128 07:37:49.918628 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:37:49 crc kubenswrapper[4776]: I0128 07:37:49.922238 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:49 crc kubenswrapper[4776]: I0128 07:37:49.928320 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.024103 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqqkg\" (UniqueName: \"kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.024302 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.024355 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.126644 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities\") pod 
\"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.126713 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.126783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqqkg\" (UniqueName: \"kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.127245 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.127255 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities\") pod \"redhat-operators-bzxbg\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.151855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqqkg\" (UniqueName: \"kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg\") pod \"redhat-operators-bzxbg\" (UID: 
\"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.259859 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.757649 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.989658 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerStarted","Data":"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488"} Jan 28 07:37:50 crc kubenswrapper[4776]: I0128 07:37:50.990035 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerStarted","Data":"f9d35d610bd376ddfdfd7d38f4379a5e352aae558e2ad900e9598fe062734fdb"} Jan 28 07:37:52 crc kubenswrapper[4776]: I0128 07:37:52.003200 4776 generic.go:334] "Generic (PLEG): container finished" podID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerID="f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488" exitCode=0 Jan 28 07:37:52 crc kubenswrapper[4776]: I0128 07:37:52.003286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerDied","Data":"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488"} Jan 28 07:37:54 crc kubenswrapper[4776]: I0128 07:37:54.035287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerStarted","Data":"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc"} Jan 28 
07:38:01 crc kubenswrapper[4776]: I0128 07:38:01.112507 4776 generic.go:334] "Generic (PLEG): container finished" podID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerID="59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc" exitCode=0 Jan 28 07:38:01 crc kubenswrapper[4776]: I0128 07:38:01.113965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerDied","Data":"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc"} Jan 28 07:38:02 crc kubenswrapper[4776]: I0128 07:38:02.125446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerStarted","Data":"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583"} Jan 28 07:38:02 crc kubenswrapper[4776]: I0128 07:38:02.152076 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzxbg" podStartSLOduration=3.605104934 podStartE2EDuration="13.152055839s" podCreationTimestamp="2026-01-28 07:37:49 +0000 UTC" firstStartedPulling="2026-01-28 07:37:52.006138692 +0000 UTC m=+2843.421798892" lastFinishedPulling="2026-01-28 07:38:01.553089597 +0000 UTC m=+2852.968749797" observedRunningTime="2026-01-28 07:38:02.145189324 +0000 UTC m=+2853.560849494" watchObservedRunningTime="2026-01-28 07:38:02.152055839 +0000 UTC m=+2853.567715999" Jan 28 07:38:03 crc kubenswrapper[4776]: I0128 07:38:03.981378 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:38:03 crc kubenswrapper[4776]: I0128 07:38:03.981681 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:38:10 crc kubenswrapper[4776]: I0128 07:38:10.260733 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:10 crc kubenswrapper[4776]: I0128 07:38:10.261692 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:11 crc kubenswrapper[4776]: I0128 07:38:11.329338 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzxbg" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" probeResult="failure" output=< Jan 28 07:38:11 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:38:11 crc kubenswrapper[4776]: > Jan 28 07:38:21 crc kubenswrapper[4776]: I0128 07:38:21.307305 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzxbg" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" probeResult="failure" output=< Jan 28 07:38:21 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:38:21 crc kubenswrapper[4776]: > Jan 28 07:38:31 crc kubenswrapper[4776]: I0128 07:38:31.308238 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzxbg" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" probeResult="failure" output=< Jan 28 07:38:31 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:38:31 crc kubenswrapper[4776]: > Jan 28 07:38:33 crc kubenswrapper[4776]: I0128 07:38:33.852341 4776 patch_prober.go:28] interesting 
pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:38:33 crc kubenswrapper[4776]: I0128 07:38:33.852921 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:38:40 crc kubenswrapper[4776]: I0128 07:38:40.332616 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:40 crc kubenswrapper[4776]: I0128 07:38:40.399463 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:40 crc kubenswrapper[4776]: I0128 07:38:40.581389 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:38:41 crc kubenswrapper[4776]: I0128 07:38:41.534603 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzxbg" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" containerID="cri-o://7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583" gracePeriod=2 Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.031595 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.084399 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content\") pod \"71432eac-c873-4fb2-a5ca-8e00edfc3120\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.084696 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities\") pod \"71432eac-c873-4fb2-a5ca-8e00edfc3120\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.084797 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqqkg\" (UniqueName: \"kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg\") pod \"71432eac-c873-4fb2-a5ca-8e00edfc3120\" (UID: \"71432eac-c873-4fb2-a5ca-8e00edfc3120\") " Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.092270 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities" (OuterVolumeSpecName: "utilities") pod "71432eac-c873-4fb2-a5ca-8e00edfc3120" (UID: "71432eac-c873-4fb2-a5ca-8e00edfc3120"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.097097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg" (OuterVolumeSpecName: "kube-api-access-kqqkg") pod "71432eac-c873-4fb2-a5ca-8e00edfc3120" (UID: "71432eac-c873-4fb2-a5ca-8e00edfc3120"). InnerVolumeSpecName "kube-api-access-kqqkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.187323 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.187361 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqqkg\" (UniqueName: \"kubernetes.io/projected/71432eac-c873-4fb2-a5ca-8e00edfc3120-kube-api-access-kqqkg\") on node \"crc\" DevicePath \"\"" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.213750 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71432eac-c873-4fb2-a5ca-8e00edfc3120" (UID: "71432eac-c873-4fb2-a5ca-8e00edfc3120"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.289874 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71432eac-c873-4fb2-a5ca-8e00edfc3120-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.553342 4776 generic.go:334] "Generic (PLEG): container finished" podID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerID="7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583" exitCode=0 Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.553405 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerDied","Data":"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583"} Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.553474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-bzxbg" event={"ID":"71432eac-c873-4fb2-a5ca-8e00edfc3120","Type":"ContainerDied","Data":"f9d35d610bd376ddfdfd7d38f4379a5e352aae558e2ad900e9598fe062734fdb"} Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.553490 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzxbg" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.553508 4776 scope.go:117] "RemoveContainer" containerID="7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.603174 4776 scope.go:117] "RemoveContainer" containerID="59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.609882 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.621727 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzxbg"] Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.627895 4776 scope.go:117] "RemoveContainer" containerID="f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.692322 4776 scope.go:117] "RemoveContainer" containerID="7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583" Jan 28 07:38:42 crc kubenswrapper[4776]: E0128 07:38:42.693112 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583\": container with ID starting with 7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583 not found: ID does not exist" containerID="7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.693163 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583"} err="failed to get container status \"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583\": rpc error: code = NotFound desc = could not find container \"7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583\": container with ID starting with 7ee6e320b0c5b0eafaa57b292bbb4f291211d85c1082aed7fc0ccc19f9e3a583 not found: ID does not exist" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.693194 4776 scope.go:117] "RemoveContainer" containerID="59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc" Jan 28 07:38:42 crc kubenswrapper[4776]: E0128 07:38:42.694194 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc\": container with ID starting with 59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc not found: ID does not exist" containerID="59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.694369 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc"} err="failed to get container status \"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc\": rpc error: code = NotFound desc = could not find container \"59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc\": container with ID starting with 59a482d2c996a745d311b0d1a18e66c64dc59f6b6c86018289ccc8923057c7fc not found: ID does not exist" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.694462 4776 scope.go:117] "RemoveContainer" containerID="f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488" Jan 28 07:38:42 crc kubenswrapper[4776]: E0128 
07:38:42.694985 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488\": container with ID starting with f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488 not found: ID does not exist" containerID="f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488" Jan 28 07:38:42 crc kubenswrapper[4776]: I0128 07:38:42.695040 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488"} err="failed to get container status \"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488\": rpc error: code = NotFound desc = could not find container \"f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488\": container with ID starting with f7e9150778bdf3ae40a2033166dedf92766ff24c2c3196c6cad3dc50392fd488 not found: ID does not exist" Jan 28 07:38:43 crc kubenswrapper[4776]: I0128 07:38:43.319286 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" path="/var/lib/kubelet/pods/71432eac-c873-4fb2-a5ca-8e00edfc3120/volumes" Jan 28 07:39:03 crc kubenswrapper[4776]: I0128 07:39:03.852037 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:39:03 crc kubenswrapper[4776]: I0128 07:39:03.852657 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 07:39:03 crc kubenswrapper[4776]: I0128 07:39:03.852719 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:39:03 crc kubenswrapper[4776]: I0128 07:39:03.853787 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:39:03 crc kubenswrapper[4776]: I0128 07:39:03.853886 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a" gracePeriod=600 Jan 28 07:39:04 crc kubenswrapper[4776]: I0128 07:39:04.783890 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a" exitCode=0 Jan 28 07:39:04 crc kubenswrapper[4776]: I0128 07:39:04.783970 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a"} Jan 28 07:39:04 crc kubenswrapper[4776]: I0128 07:39:04.784433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"} Jan 28 07:39:04 crc 
kubenswrapper[4776]: I0128 07:39:04.784458 4776 scope.go:117] "RemoveContainer" containerID="f12f1206c10741f1aa3dd6ea2eec3d34efef2a055803a3f85d0c045ffdf3275e" Jan 28 07:41:33 crc kubenswrapper[4776]: I0128 07:41:33.851616 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:41:33 crc kubenswrapper[4776]: I0128 07:41:33.852397 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:42:03 crc kubenswrapper[4776]: I0128 07:42:03.852441 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:42:03 crc kubenswrapper[4776]: I0128 07:42:03.853169 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.945323 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:17 crc kubenswrapper[4776]: E0128 07:42:17.946432 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" 
containerName="extract-utilities" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.946450 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="extract-utilities" Jan 28 07:42:17 crc kubenswrapper[4776]: E0128 07:42:17.946493 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.946502 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" Jan 28 07:42:17 crc kubenswrapper[4776]: E0128 07:42:17.946530 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="extract-content" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.946538 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="extract-content" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.946811 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="71432eac-c873-4fb2-a5ca-8e00edfc3120" containerName="registry-server" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.948586 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:17 crc kubenswrapper[4776]: I0128 07:42:17.956582 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.073183 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.073269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264jq\" (UniqueName: \"kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.073411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.175013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.175393 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.175442 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-264jq\" (UniqueName: \"kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.176194 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.176414 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.199915 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-264jq\" (UniqueName: \"kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq\") pod \"redhat-marketplace-wrr59\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.270368 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.743884 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:18 crc kubenswrapper[4776]: W0128 07:42:18.748343 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf673bee7_0f34_47de_a996_af53ff140c91.slice/crio-f621531162113aa0bf60207c109e4867b5e66fb81e873d10967a39d3557ea449 WatchSource:0}: Error finding container f621531162113aa0bf60207c109e4867b5e66fb81e873d10967a39d3557ea449: Status 404 returned error can't find the container with id f621531162113aa0bf60207c109e4867b5e66fb81e873d10967a39d3557ea449 Jan 28 07:42:18 crc kubenswrapper[4776]: I0128 07:42:18.872812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerStarted","Data":"f621531162113aa0bf60207c109e4867b5e66fb81e873d10967a39d3557ea449"} Jan 28 07:42:19 crc kubenswrapper[4776]: I0128 07:42:19.893813 4776 generic.go:334] "Generic (PLEG): container finished" podID="f673bee7-0f34-47de-a996-af53ff140c91" containerID="a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f" exitCode=0 Jan 28 07:42:19 crc kubenswrapper[4776]: I0128 07:42:19.894193 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerDied","Data":"a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f"} Jan 28 07:42:19 crc kubenswrapper[4776]: I0128 07:42:19.904271 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:42:20 crc kubenswrapper[4776]: I0128 07:42:20.915403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerStarted","Data":"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181"} Jan 28 07:42:21 crc kubenswrapper[4776]: I0128 07:42:21.926397 4776 generic.go:334] "Generic (PLEG): container finished" podID="f673bee7-0f34-47de-a996-af53ff140c91" containerID="45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181" exitCode=0 Jan 28 07:42:21 crc kubenswrapper[4776]: I0128 07:42:21.926463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerDied","Data":"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181"} Jan 28 07:42:22 crc kubenswrapper[4776]: I0128 07:42:22.939037 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerStarted","Data":"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86"} Jan 28 07:42:22 crc kubenswrapper[4776]: I0128 07:42:22.970086 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wrr59" podStartSLOduration=3.535713385 podStartE2EDuration="5.970065047s" podCreationTimestamp="2026-01-28 07:42:17 +0000 UTC" firstStartedPulling="2026-01-28 07:42:19.901730377 +0000 UTC m=+3111.317390547" lastFinishedPulling="2026-01-28 07:42:22.336082049 +0000 UTC m=+3113.751742209" observedRunningTime="2026-01-28 07:42:22.956298615 +0000 UTC m=+3114.371958795" watchObservedRunningTime="2026-01-28 07:42:22.970065047 +0000 UTC m=+3114.385725217" Jan 28 07:42:28 crc kubenswrapper[4776]: I0128 07:42:28.270630 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:28 crc kubenswrapper[4776]: I0128 07:42:28.271159 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:28 crc kubenswrapper[4776]: I0128 07:42:28.317787 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:29 crc kubenswrapper[4776]: I0128 07:42:29.042451 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:29 crc kubenswrapper[4776]: I0128 07:42:29.103813 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.012462 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wrr59" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="registry-server" containerID="cri-o://4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86" gracePeriod=2 Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.599893 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.773758 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities\") pod \"f673bee7-0f34-47de-a996-af53ff140c91\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.773877 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-264jq\" (UniqueName: \"kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq\") pod \"f673bee7-0f34-47de-a996-af53ff140c91\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.774112 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content\") pod \"f673bee7-0f34-47de-a996-af53ff140c91\" (UID: \"f673bee7-0f34-47de-a996-af53ff140c91\") " Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.775202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities" (OuterVolumeSpecName: "utilities") pod "f673bee7-0f34-47de-a996-af53ff140c91" (UID: "f673bee7-0f34-47de-a996-af53ff140c91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.780029 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq" (OuterVolumeSpecName: "kube-api-access-264jq") pod "f673bee7-0f34-47de-a996-af53ff140c91" (UID: "f673bee7-0f34-47de-a996-af53ff140c91"). InnerVolumeSpecName "kube-api-access-264jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.799394 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f673bee7-0f34-47de-a996-af53ff140c91" (UID: "f673bee7-0f34-47de-a996-af53ff140c91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.876138 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.876476 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f673bee7-0f34-47de-a996-af53ff140c91-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:42:31 crc kubenswrapper[4776]: I0128 07:42:31.876487 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-264jq\" (UniqueName: \"kubernetes.io/projected/f673bee7-0f34-47de-a996-af53ff140c91-kube-api-access-264jq\") on node \"crc\" DevicePath \"\"" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.028025 4776 generic.go:334] "Generic (PLEG): container finished" podID="f673bee7-0f34-47de-a996-af53ff140c91" containerID="4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86" exitCode=0 Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.028119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerDied","Data":"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86"} Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.028161 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wrr59" event={"ID":"f673bee7-0f34-47de-a996-af53ff140c91","Type":"ContainerDied","Data":"f621531162113aa0bf60207c109e4867b5e66fb81e873d10967a39d3557ea449"} Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.028159 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrr59" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.028180 4776 scope.go:117] "RemoveContainer" containerID="4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.060900 4776 scope.go:117] "RemoveContainer" containerID="45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.083864 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.095843 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrr59"] Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.103465 4776 scope.go:117] "RemoveContainer" containerID="a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.139206 4776 scope.go:117] "RemoveContainer" containerID="4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86" Jan 28 07:42:32 crc kubenswrapper[4776]: E0128 07:42:32.139821 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86\": container with ID starting with 4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86 not found: ID does not exist" containerID="4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.139910 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86"} err="failed to get container status \"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86\": rpc error: code = NotFound desc = could not find container \"4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86\": container with ID starting with 4e885f15ecbc3f088da0c9574482f8a9511bff33dce20b72cc10de1b152afb86 not found: ID does not exist" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.140025 4776 scope.go:117] "RemoveContainer" containerID="45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181" Jan 28 07:42:32 crc kubenswrapper[4776]: E0128 07:42:32.140410 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181\": container with ID starting with 45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181 not found: ID does not exist" containerID="45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.140500 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181"} err="failed to get container status \"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181\": rpc error: code = NotFound desc = could not find container \"45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181\": container with ID starting with 45cdbeb18e9c21671972c2bcd03b757b21ec4dc13739b17482133adc2eca8181 not found: ID does not exist" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.140611 4776 scope.go:117] "RemoveContainer" containerID="a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f" Jan 28 07:42:32 crc kubenswrapper[4776]: E0128 
07:42:32.140998 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f\": container with ID starting with a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f not found: ID does not exist" containerID="a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f" Jan 28 07:42:32 crc kubenswrapper[4776]: I0128 07:42:32.141066 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f"} err="failed to get container status \"a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f\": rpc error: code = NotFound desc = could not find container \"a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f\": container with ID starting with a7bf94f45bad9da32aaca31fbeed4f6c714407f854d9fb8daec8714ff1fe346f not found: ID does not exist" Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.325339 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f673bee7-0f34-47de-a996-af53ff140c91" path="/var/lib/kubelet/pods/f673bee7-0f34-47de-a996-af53ff140c91/volumes" Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.851991 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.852044 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.852083 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.852831 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:42:33 crc kubenswrapper[4776]: I0128 07:42:33.852901 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" gracePeriod=600 Jan 28 07:42:33 crc kubenswrapper[4776]: E0128 07:42:33.989574 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:42:34 crc kubenswrapper[4776]: I0128 07:42:34.055859 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" exitCode=0 Jan 28 07:42:34 crc kubenswrapper[4776]: I0128 07:42:34.055916 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" 
event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"} Jan 28 07:42:34 crc kubenswrapper[4776]: I0128 07:42:34.055954 4776 scope.go:117] "RemoveContainer" containerID="fed86bb4db91dc0975b599ff4c252d854cf12a83f95685d05a3e121b9858944a" Jan 28 07:42:34 crc kubenswrapper[4776]: I0128 07:42:34.056754 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:42:34 crc kubenswrapper[4776]: E0128 07:42:34.057059 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:42:46 crc kubenswrapper[4776]: I0128 07:42:46.305388 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:42:46 crc kubenswrapper[4776]: E0128 07:42:46.306356 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:43:01 crc kubenswrapper[4776]: I0128 07:43:01.304744 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:43:01 crc kubenswrapper[4776]: E0128 07:43:01.305631 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:43:16 crc kubenswrapper[4776]: I0128 07:43:16.305443 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:43:16 crc kubenswrapper[4776]: E0128 07:43:16.306851 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:43:27 crc kubenswrapper[4776]: I0128 07:43:27.306728 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:43:27 crc kubenswrapper[4776]: E0128 07:43:27.307493 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:43:38 crc kubenswrapper[4776]: I0128 07:43:38.305327 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:43:38 crc kubenswrapper[4776]: E0128 07:43:38.306336 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:43:49 crc kubenswrapper[4776]: I0128 07:43:49.313113 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:43:49 crc kubenswrapper[4776]: E0128 07:43:49.314079 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:44:03 crc kubenswrapper[4776]: I0128 07:44:03.304895 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:44:03 crc kubenswrapper[4776]: E0128 07:44:03.305857 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:44:14 crc kubenswrapper[4776]: I0128 07:44:14.305376 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:44:14 crc kubenswrapper[4776]: E0128 07:44:14.307256 4776 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:44:28 crc kubenswrapper[4776]: I0128 07:44:28.305282 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:44:28 crc kubenswrapper[4776]: E0128 07:44:28.307439 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:44:43 crc kubenswrapper[4776]: I0128 07:44:43.305256 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:44:43 crc kubenswrapper[4776]: E0128 07:44:43.307027 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:44:57 crc kubenswrapper[4776]: I0128 07:44:57.306650 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:44:57 crc kubenswrapper[4776]: E0128 07:44:57.307468 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.149865 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd"] Jan 28 07:45:00 crc kubenswrapper[4776]: E0128 07:45:00.150591 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="extract-utilities" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.150604 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="extract-utilities" Jan 28 07:45:00 crc kubenswrapper[4776]: E0128 07:45:00.150629 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="extract-content" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.150635 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="extract-content" Jan 28 07:45:00 crc kubenswrapper[4776]: E0128 07:45:00.150650 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="registry-server" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.150656 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f673bee7-0f34-47de-a996-af53ff140c91" containerName="registry-server" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.150857 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f673bee7-0f34-47de-a996-af53ff140c91" 
containerName="registry-server" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.151486 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.154450 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.155190 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.165316 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd"] Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.350628 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.350690 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.350960 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lg4p\" (UniqueName: 
\"kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.452588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lg4p\" (UniqueName: \"kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.452669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.452707 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.453624 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 
07:45:00.460426 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.486984 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lg4p\" (UniqueName: \"kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p\") pod \"collect-profiles-29493105-c69pd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.490094 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:00 crc kubenswrapper[4776]: I0128 07:45:00.955925 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd"] Jan 28 07:45:01 crc kubenswrapper[4776]: I0128 07:45:01.618918 4776 generic.go:334] "Generic (PLEG): container finished" podID="1521a499-587e-4bc9-86f4-a572e4e238cd" containerID="97111fc25e3e34ce5a74d5ace2d9bb564d4c66edc8106234018d1dcc3f2358c6" exitCode=0 Jan 28 07:45:01 crc kubenswrapper[4776]: I0128 07:45:01.619007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" event={"ID":"1521a499-587e-4bc9-86f4-a572e4e238cd","Type":"ContainerDied","Data":"97111fc25e3e34ce5a74d5ace2d9bb564d4c66edc8106234018d1dcc3f2358c6"} Jan 28 07:45:01 crc kubenswrapper[4776]: I0128 07:45:01.619204 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" 
event={"ID":"1521a499-587e-4bc9-86f4-a572e4e238cd","Type":"ContainerStarted","Data":"6f2e2227faac5525760f2b6937da531cfa7fdd0b257fd7b72d565b113a381e7b"} Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.029208 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.207427 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume\") pod \"1521a499-587e-4bc9-86f4-a572e4e238cd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.207670 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lg4p\" (UniqueName: \"kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p\") pod \"1521a499-587e-4bc9-86f4-a572e4e238cd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.207743 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume\") pod \"1521a499-587e-4bc9-86f4-a572e4e238cd\" (UID: \"1521a499-587e-4bc9-86f4-a572e4e238cd\") " Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.207939 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "1521a499-587e-4bc9-86f4-a572e4e238cd" (UID: "1521a499-587e-4bc9-86f4-a572e4e238cd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.208235 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1521a499-587e-4bc9-86f4-a572e4e238cd-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.214932 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1521a499-587e-4bc9-86f4-a572e4e238cd" (UID: "1521a499-587e-4bc9-86f4-a572e4e238cd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.215212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p" (OuterVolumeSpecName: "kube-api-access-4lg4p") pod "1521a499-587e-4bc9-86f4-a572e4e238cd" (UID: "1521a499-587e-4bc9-86f4-a572e4e238cd"). InnerVolumeSpecName "kube-api-access-4lg4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.311378 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1521a499-587e-4bc9-86f4-a572e4e238cd-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.311425 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lg4p\" (UniqueName: \"kubernetes.io/projected/1521a499-587e-4bc9-86f4-a572e4e238cd-kube-api-access-4lg4p\") on node \"crc\" DevicePath \"\"" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.637270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" event={"ID":"1521a499-587e-4bc9-86f4-a572e4e238cd","Type":"ContainerDied","Data":"6f2e2227faac5525760f2b6937da531cfa7fdd0b257fd7b72d565b113a381e7b"} Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.637623 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f2e2227faac5525760f2b6937da531cfa7fdd0b257fd7b72d565b113a381e7b" Jan 28 07:45:03 crc kubenswrapper[4776]: I0128 07:45:03.637378 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd" Jan 28 07:45:04 crc kubenswrapper[4776]: I0128 07:45:04.123528 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x"] Jan 28 07:45:04 crc kubenswrapper[4776]: I0128 07:45:04.132661 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493060-2px6x"] Jan 28 07:45:05 crc kubenswrapper[4776]: I0128 07:45:05.321838 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b42b4d1-1d09-49c6-bbb9-e4c0370554c0" path="/var/lib/kubelet/pods/2b42b4d1-1d09-49c6-bbb9-e4c0370554c0/volumes" Jan 28 07:45:12 crc kubenswrapper[4776]: I0128 07:45:12.305388 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:45:12 crc kubenswrapper[4776]: E0128 07:45:12.306244 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.354883 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwzzz"] Jan 28 07:45:25 crc kubenswrapper[4776]: E0128 07:45:25.361885 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1521a499-587e-4bc9-86f4-a572e4e238cd" containerName="collect-profiles" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.361929 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1521a499-587e-4bc9-86f4-a572e4e238cd" containerName="collect-profiles" Jan 28 07:45:25 crc 
kubenswrapper[4776]: I0128 07:45:25.362242 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1521a499-587e-4bc9-86f4-a572e4e238cd" containerName="collect-profiles" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.364153 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.377000 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwzzz"] Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.455740 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.455890 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzgj\" (UniqueName: \"kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.455995 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.558121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.558215 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzgj\" (UniqueName: \"kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.558306 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.558732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.558758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.577281 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzgj\" (UniqueName: 
\"kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj\") pod \"community-operators-zwzzz\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") " pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:25 crc kubenswrapper[4776]: I0128 07:45:25.700316 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwzzz" Jan 28 07:45:26 crc kubenswrapper[4776]: I0128 07:45:26.279683 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwzzz"] Jan 28 07:45:26 crc kubenswrapper[4776]: I0128 07:45:26.306127 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:45:26 crc kubenswrapper[4776]: E0128 07:45:26.306575 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:45:26 crc kubenswrapper[4776]: I0128 07:45:26.906611 4776 generic.go:334] "Generic (PLEG): container finished" podID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerID="16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd" exitCode=0 Jan 28 07:45:26 crc kubenswrapper[4776]: I0128 07:45:26.906727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerDied","Data":"16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd"} Jan 28 07:45:26 crc kubenswrapper[4776]: I0128 07:45:26.906944 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerStarted","Data":"5ee9f2fadfb8c06c3c0409debe65958c26f1a1acd23bb08575089dc36ba523c4"} Jan 28 07:45:27 crc kubenswrapper[4776]: I0128 07:45:27.919752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerStarted","Data":"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"} Jan 28 07:45:28 crc kubenswrapper[4776]: I0128 07:45:28.934562 4776 generic.go:334] "Generic (PLEG): container finished" podID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerID="23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100" exitCode=0 Jan 28 07:45:28 crc kubenswrapper[4776]: I0128 07:45:28.934629 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerDied","Data":"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"} Jan 28 07:45:29 crc kubenswrapper[4776]: I0128 07:45:29.948860 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerStarted","Data":"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"} Jan 28 07:45:29 crc kubenswrapper[4776]: I0128 07:45:29.975214 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwzzz" podStartSLOduration=2.525082278 podStartE2EDuration="4.975188206s" podCreationTimestamp="2026-01-28 07:45:25 +0000 UTC" firstStartedPulling="2026-01-28 07:45:26.908771247 +0000 UTC m=+3298.324431407" lastFinishedPulling="2026-01-28 07:45:29.358877155 +0000 UTC m=+3300.774537335" observedRunningTime="2026-01-28 07:45:29.966525962 +0000 UTC m=+3301.382186122" 
watchObservedRunningTime="2026-01-28 07:45:29.975188206 +0000 UTC m=+3301.390848366"
Jan 28 07:45:35 crc kubenswrapper[4776]: I0128 07:45:35.700879 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:35 crc kubenswrapper[4776]: I0128 07:45:35.701614 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:35 crc kubenswrapper[4776]: I0128 07:45:35.786637 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:36 crc kubenswrapper[4776]: I0128 07:45:36.048049 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:37 crc kubenswrapper[4776]: I0128 07:45:37.098315 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwzzz"]
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.015334 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwzzz" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="registry-server" containerID="cri-o://863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829" gracePeriod=2
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.305873 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:45:38 crc kubenswrapper[4776]: E0128 07:45:38.306185 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.509686 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.533710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzgj\" (UniqueName: \"kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj\") pod \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") "
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.537719 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content\") pod \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") "
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.537792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities\") pod \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\" (UID: \"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2\") "
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.542822 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj" (OuterVolumeSpecName: "kube-api-access-2kzgj") pod "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" (UID: "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2"). InnerVolumeSpecName "kube-api-access-2kzgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.543374 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities" (OuterVolumeSpecName: "utilities") pod "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" (UID: "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.589587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" (UID: "9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.640421 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzgj\" (UniqueName: \"kubernetes.io/projected/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-kube-api-access-2kzgj\") on node \"crc\" DevicePath \"\""
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.640492 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:45:38 crc kubenswrapper[4776]: I0128 07:45:38.640508 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.028818 4776 generic.go:334] "Generic (PLEG): container finished" podID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerID="863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829" exitCode=0
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.028880 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwzzz"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.029083 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerDied","Data":"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"}
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.029146 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwzzz" event={"ID":"9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2","Type":"ContainerDied","Data":"5ee9f2fadfb8c06c3c0409debe65958c26f1a1acd23bb08575089dc36ba523c4"}
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.029216 4776 scope.go:117] "RemoveContainer" containerID="863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.073055 4776 scope.go:117] "RemoveContainer" containerID="23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.078358 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwzzz"]
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.089524 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwzzz"]
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.098993 4776 scope.go:117] "RemoveContainer" containerID="16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.159742 4776 scope.go:117] "RemoveContainer" containerID="863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"
Jan 28 07:45:39 crc kubenswrapper[4776]: E0128 07:45:39.160336 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829\": container with ID starting with 863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829 not found: ID does not exist" containerID="863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.160420 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829"} err="failed to get container status \"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829\": rpc error: code = NotFound desc = could not find container \"863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829\": container with ID starting with 863efc6ae867102a998f96c23cc0b5a3b361afb4b98b15ec3336ffcacc321829 not found: ID does not exist"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.160486 4776 scope.go:117] "RemoveContainer" containerID="23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"
Jan 28 07:45:39 crc kubenswrapper[4776]: E0128 07:45:39.160920 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100\": container with ID starting with 23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100 not found: ID does not exist" containerID="23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.161039 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100"} err="failed to get container status \"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100\": rpc error: code = NotFound desc = could not find container \"23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100\": container with ID starting with 23e4d7f2b37aa114cbbd8e8c33a8cc0636645c9c2981268ef894a16e13f67100 not found: ID does not exist"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.161353 4776 scope.go:117] "RemoveContainer" containerID="16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd"
Jan 28 07:45:39 crc kubenswrapper[4776]: E0128 07:45:39.161755 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd\": container with ID starting with 16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd not found: ID does not exist" containerID="16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.161788 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd"} err="failed to get container status \"16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd\": rpc error: code = NotFound desc = could not find container \"16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd\": container with ID starting with 16c7065b97c2452eb768081c758a46528a19f964cc5fd7f6f71f28bfe18343dd not found: ID does not exist"
Jan 28 07:45:39 crc kubenswrapper[4776]: I0128 07:45:39.321862 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" path="/var/lib/kubelet/pods/9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2/volumes"
Jan 28 07:45:49 crc kubenswrapper[4776]: I0128 07:45:49.376890 4776 scope.go:117] "RemoveContainer" containerID="19289311f5ffbf2895f689d8cc6011b96409df6687327543c8108b6d1cc218de"
Jan 28 07:45:51 crc kubenswrapper[4776]: I0128 07:45:51.307151 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:45:51 crc kubenswrapper[4776]: E0128 07:45:51.307871 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:46:06 crc kubenswrapper[4776]: I0128 07:46:06.305064 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:46:06 crc kubenswrapper[4776]: E0128 07:46:06.305966 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:46:17 crc kubenswrapper[4776]: I0128 07:46:17.306747 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:46:17 crc kubenswrapper[4776]: E0128 07:46:17.307645 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:46:30 crc kubenswrapper[4776]: I0128 07:46:30.305375 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:46:30 crc kubenswrapper[4776]: E0128 07:46:30.306098 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:46:41 crc kubenswrapper[4776]: I0128 07:46:41.305053 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:46:41 crc kubenswrapper[4776]: E0128 07:46:41.306034 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:46:52 crc kubenswrapper[4776]: I0128 07:46:52.304823 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:46:52 crc kubenswrapper[4776]: E0128 07:46:52.305867 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:47:04 crc kubenswrapper[4776]: I0128 07:47:04.305170 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:47:04 crc kubenswrapper[4776]: E0128 07:47:04.305926 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:47:18 crc kubenswrapper[4776]: I0128 07:47:18.304639 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:47:18 crc kubenswrapper[4776]: E0128 07:47:18.305381 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:47:31 crc kubenswrapper[4776]: I0128 07:47:31.304864 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:47:31 crc kubenswrapper[4776]: E0128 07:47:31.306064 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"
Jan 28 07:47:46 crc kubenswrapper[4776]: I0128 07:47:46.305900 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0"
Jan 28 07:47:47 crc kubenswrapper[4776]: I0128 07:47:47.350228 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda"}
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.948185 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:47:57 crc kubenswrapper[4776]: E0128 07:47:57.949301 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="extract-content"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.949316 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="extract-content"
Jan 28 07:47:57 crc kubenswrapper[4776]: E0128 07:47:57.949328 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="extract-utilities"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.949336 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="extract-utilities"
Jan 28 07:47:57 crc kubenswrapper[4776]: E0128 07:47:57.949357 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="registry-server"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.949366 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="registry-server"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.949693 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae11c68-adc6-4a0a-b906-d4e1f6b8d6c2" containerName="registry-server"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.951421 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:57 crc kubenswrapper[4776]: I0128 07:47:57.958790 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.080105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.080208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.080269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw5xv\" (UniqueName: \"kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.181982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.182040 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.182091 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw5xv\" (UniqueName: \"kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.182496 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.182538 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.207369 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw5xv\" (UniqueName: \"kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv\") pod \"certified-operators-qjmzr\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") " pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.281766 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:47:58 crc kubenswrapper[4776]: I0128 07:47:58.784922 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:47:59 crc kubenswrapper[4776]: I0128 07:47:59.472441 4776 generic.go:334] "Generic (PLEG): container finished" podID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerID="881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564" exitCode=0
Jan 28 07:47:59 crc kubenswrapper[4776]: I0128 07:47:59.472529 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerDied","Data":"881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564"}
Jan 28 07:47:59 crc kubenswrapper[4776]: I0128 07:47:59.472828 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerStarted","Data":"c104d0a09e295df9a519265cd38662cb41474f57c7fe23eeb7afee9c180d285e"}
Jan 28 07:47:59 crc kubenswrapper[4776]: I0128 07:47:59.475472 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 07:48:00 crc kubenswrapper[4776]: I0128 07:48:00.485373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerStarted","Data":"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"}
Jan 28 07:48:01 crc kubenswrapper[4776]: I0128 07:48:01.494709 4776 generic.go:334] "Generic (PLEG): container finished" podID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerID="a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970" exitCode=0
Jan 28 07:48:01 crc kubenswrapper[4776]: I0128 07:48:01.494764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerDied","Data":"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"}
Jan 28 07:48:02 crc kubenswrapper[4776]: I0128 07:48:02.508582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerStarted","Data":"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"}
Jan 28 07:48:02 crc kubenswrapper[4776]: I0128 07:48:02.533347 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjmzr" podStartSLOduration=2.90957355 podStartE2EDuration="5.533326176s" podCreationTimestamp="2026-01-28 07:47:57 +0000 UTC" firstStartedPulling="2026-01-28 07:47:59.475197912 +0000 UTC m=+3450.890858072" lastFinishedPulling="2026-01-28 07:48:02.098950528 +0000 UTC m=+3453.514610698" observedRunningTime="2026-01-28 07:48:02.524510769 +0000 UTC m=+3453.940170949" watchObservedRunningTime="2026-01-28 07:48:02.533326176 +0000 UTC m=+3453.948986336"
Jan 28 07:48:08 crc kubenswrapper[4776]: I0128 07:48:08.283863 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:08 crc kubenswrapper[4776]: I0128 07:48:08.284351 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:08 crc kubenswrapper[4776]: I0128 07:48:08.352634 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:08 crc kubenswrapper[4776]: I0128 07:48:08.753860 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:08 crc kubenswrapper[4776]: I0128 07:48:08.808395 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:48:10 crc kubenswrapper[4776]: I0128 07:48:10.594191 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjmzr" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="registry-server" containerID="cri-o://1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c" gracePeriod=2
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.161487 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.266516 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities\") pod \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") "
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.266661 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw5xv\" (UniqueName: \"kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv\") pod \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") "
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.266773 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content\") pod \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\" (UID: \"99e5c4c0-c53b-4634-9c82-5ce8f362744c\") "
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.267142 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities" (OuterVolumeSpecName: "utilities") pod "99e5c4c0-c53b-4634-9c82-5ce8f362744c" (UID: "99e5c4c0-c53b-4634-9c82-5ce8f362744c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.267658 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.272348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv" (OuterVolumeSpecName: "kube-api-access-tw5xv") pod "99e5c4c0-c53b-4634-9c82-5ce8f362744c" (UID: "99e5c4c0-c53b-4634-9c82-5ce8f362744c"). InnerVolumeSpecName "kube-api-access-tw5xv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.309096 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99e5c4c0-c53b-4634-9c82-5ce8f362744c" (UID: "99e5c4c0-c53b-4634-9c82-5ce8f362744c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.369298 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw5xv\" (UniqueName: \"kubernetes.io/projected/99e5c4c0-c53b-4634-9c82-5ce8f362744c-kube-api-access-tw5xv\") on node \"crc\" DevicePath \"\""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.369324 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e5c4c0-c53b-4634-9c82-5ce8f362744c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.607826 4776 generic.go:334] "Generic (PLEG): container finished" podID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerID="1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c" exitCode=0
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.607898 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjmzr"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.607925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerDied","Data":"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"}
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.608362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjmzr" event={"ID":"99e5c4c0-c53b-4634-9c82-5ce8f362744c","Type":"ContainerDied","Data":"c104d0a09e295df9a519265cd38662cb41474f57c7fe23eeb7afee9c180d285e"}
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.608411 4776 scope.go:117] "RemoveContainer" containerID="1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.639419 4776 scope.go:117] "RemoveContainer" containerID="a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.642508 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.654272 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjmzr"]
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.674288 4776 scope.go:117] "RemoveContainer" containerID="881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.723425 4776 scope.go:117] "RemoveContainer" containerID="1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"
Jan 28 07:48:11 crc kubenswrapper[4776]: E0128 07:48:11.724854 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c\": container with ID starting with 1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c not found: ID does not exist" containerID="1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.724939 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c"} err="failed to get container status \"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c\": rpc error: code = NotFound desc = could not find container \"1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c\": container with ID starting with 1a9c53df6a509a37bb60e16cea041fe6975d00d9ac42fb9e983c1ddc3cdda84c not found: ID does not exist"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.725020 4776 scope.go:117] "RemoveContainer" containerID="a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"
Jan 28 07:48:11 crc kubenswrapper[4776]: E0128 07:48:11.728869 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970\": container with ID starting with a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970 not found: ID does not exist" containerID="a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.729028 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970"} err="failed to get container status \"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970\": rpc error: code = NotFound desc = could not find container \"a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970\": container with ID starting with a97b346b0a81b9fd3d4345676a5248ea656974e2b1dd6af63afdb06e67383970 not found: ID does not exist"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.729132 4776 scope.go:117] "RemoveContainer" containerID="881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564"
Jan 28 07:48:11 crc kubenswrapper[4776]: E0128 07:48:11.729689 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564\": container with ID starting with 881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564 not found: ID does not exist" containerID="881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564"
Jan 28 07:48:11 crc kubenswrapper[4776]: I0128 07:48:11.729724 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564"} err="failed to get container status \"881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564\": rpc error: code = NotFound desc = could not find container \"881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564\": container with ID starting with 881d23cce8d598846d4d5babc728cf90d7b54d7a776bec3b77e2677300151564 not found: ID does not exist"
Jan 28 07:48:13 crc kubenswrapper[4776]: I0128 07:48:13.323229 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" path="/var/lib/kubelet/pods/99e5c4c0-c53b-4634-9c82-5ce8f362744c/volumes"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.047889 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"]
Jan 28 07:48:35 crc kubenswrapper[4776]: E0128 07:48:35.048732 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="extract-utilities"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.048745 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="extract-utilities"
Jan 28 07:48:35 crc kubenswrapper[4776]: E0128 07:48:35.048771 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="registry-server"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.048777 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="registry-server"
Jan 28 07:48:35 crc kubenswrapper[4776]: E0128 07:48:35.048787 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="extract-content"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.048793 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="extract-content"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.048969 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e5c4c0-c53b-4634-9c82-5ce8f362744c" containerName="registry-server"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.050269 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m89h4"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.066107 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"]
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.183828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ztwt\" (UniqueName: \"kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.184173 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.184319 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4"
Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.286133 4776 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-8ztwt\" (UniqueName: \"kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.286177 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.286205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.286644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.286702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.306606 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ztwt\" (UniqueName: 
\"kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt\") pod \"redhat-operators-m89h4\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.381252 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:35 crc kubenswrapper[4776]: I0128 07:48:35.932667 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"] Jan 28 07:48:36 crc kubenswrapper[4776]: I0128 07:48:36.889238 4776 generic.go:334] "Generic (PLEG): container finished" podID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerID="c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0" exitCode=0 Jan 28 07:48:36 crc kubenswrapper[4776]: I0128 07:48:36.889495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerDied","Data":"c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0"} Jan 28 07:48:36 crc kubenswrapper[4776]: I0128 07:48:36.889727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerStarted","Data":"02cc18e028cb5955e8da713fcb45b031752e32a158c6f30cd0e3c50a4ec13a1c"} Jan 28 07:48:38 crc kubenswrapper[4776]: I0128 07:48:38.910954 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerStarted","Data":"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01"} Jan 28 07:48:42 crc kubenswrapper[4776]: I0128 07:48:42.949525 4776 generic.go:334] "Generic (PLEG): container finished" podID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" 
containerID="325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01" exitCode=0 Jan 28 07:48:42 crc kubenswrapper[4776]: I0128 07:48:42.949585 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerDied","Data":"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01"} Jan 28 07:48:43 crc kubenswrapper[4776]: I0128 07:48:43.960952 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerStarted","Data":"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1"} Jan 28 07:48:43 crc kubenswrapper[4776]: I0128 07:48:43.979222 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m89h4" podStartSLOduration=2.478314595 podStartE2EDuration="8.979206877s" podCreationTimestamp="2026-01-28 07:48:35 +0000 UTC" firstStartedPulling="2026-01-28 07:48:36.902899671 +0000 UTC m=+3488.318559851" lastFinishedPulling="2026-01-28 07:48:43.403791963 +0000 UTC m=+3494.819452133" observedRunningTime="2026-01-28 07:48:43.974048867 +0000 UTC m=+3495.389709027" watchObservedRunningTime="2026-01-28 07:48:43.979206877 +0000 UTC m=+3495.394867037" Jan 28 07:48:45 crc kubenswrapper[4776]: I0128 07:48:45.382001 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:45 crc kubenswrapper[4776]: I0128 07:48:45.382582 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:48:46 crc kubenswrapper[4776]: I0128 07:48:46.444063 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m89h4" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" 
probeResult="failure" output=< Jan 28 07:48:46 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:48:46 crc kubenswrapper[4776]: > Jan 28 07:48:56 crc kubenswrapper[4776]: I0128 07:48:56.447754 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m89h4" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" probeResult="failure" output=< Jan 28 07:48:56 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:48:56 crc kubenswrapper[4776]: > Jan 28 07:49:05 crc kubenswrapper[4776]: I0128 07:49:05.458857 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:49:05 crc kubenswrapper[4776]: I0128 07:49:05.529140 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:49:06 crc kubenswrapper[4776]: I0128 07:49:06.249101 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"] Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.212716 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m89h4" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" containerID="cri-o://36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1" gracePeriod=2 Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.780303 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.903937 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ztwt\" (UniqueName: \"kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt\") pod \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.903998 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content\") pod \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.904080 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities\") pod \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\" (UID: \"e868dbea-c6b4-4da5-9960-a911c7a3f65e\") " Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.904931 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities" (OuterVolumeSpecName: "utilities") pod "e868dbea-c6b4-4da5-9960-a911c7a3f65e" (UID: "e868dbea-c6b4-4da5-9960-a911c7a3f65e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:49:07 crc kubenswrapper[4776]: I0128 07:49:07.916739 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt" (OuterVolumeSpecName: "kube-api-access-8ztwt") pod "e868dbea-c6b4-4da5-9960-a911c7a3f65e" (UID: "e868dbea-c6b4-4da5-9960-a911c7a3f65e"). InnerVolumeSpecName "kube-api-access-8ztwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.006251 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.006292 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ztwt\" (UniqueName: \"kubernetes.io/projected/e868dbea-c6b4-4da5-9960-a911c7a3f65e-kube-api-access-8ztwt\") on node \"crc\" DevicePath \"\"" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.028050 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e868dbea-c6b4-4da5-9960-a911c7a3f65e" (UID: "e868dbea-c6b4-4da5-9960-a911c7a3f65e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.108780 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e868dbea-c6b4-4da5-9960-a911c7a3f65e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.225998 4776 generic.go:334] "Generic (PLEG): container finished" podID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerID="36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1" exitCode=0 Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.226039 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerDied","Data":"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1"} Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.226066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-m89h4" event={"ID":"e868dbea-c6b4-4da5-9960-a911c7a3f65e","Type":"ContainerDied","Data":"02cc18e028cb5955e8da713fcb45b031752e32a158c6f30cd0e3c50a4ec13a1c"} Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.226085 4776 scope.go:117] "RemoveContainer" containerID="36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.226149 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m89h4" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.256884 4776 scope.go:117] "RemoveContainer" containerID="325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.272232 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"] Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.282889 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m89h4"] Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.298188 4776 scope.go:117] "RemoveContainer" containerID="c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.356535 4776 scope.go:117] "RemoveContainer" containerID="36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1" Jan 28 07:49:08 crc kubenswrapper[4776]: E0128 07:49:08.356912 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1\": container with ID starting with 36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1 not found: ID does not exist" containerID="36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.356950 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1"} err="failed to get container status \"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1\": rpc error: code = NotFound desc = could not find container \"36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1\": container with ID starting with 36a9aa09aae8215fb1224bab18dcecfeb8ace2853967e06ef5fd23b96e7c66b1 not found: ID does not exist" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.356969 4776 scope.go:117] "RemoveContainer" containerID="325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01" Jan 28 07:49:08 crc kubenswrapper[4776]: E0128 07:49:08.357436 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01\": container with ID starting with 325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01 not found: ID does not exist" containerID="325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.357474 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01"} err="failed to get container status \"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01\": rpc error: code = NotFound desc = could not find container \"325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01\": container with ID starting with 325f330f425878c8a2d2994ea6b03b57dd9cc91a57fe71121db34db7e65caa01 not found: ID does not exist" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.357494 4776 scope.go:117] "RemoveContainer" containerID="c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0" Jan 28 07:49:08 crc kubenswrapper[4776]: E0128 
07:49:08.357812 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0\": container with ID starting with c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0 not found: ID does not exist" containerID="c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0" Jan 28 07:49:08 crc kubenswrapper[4776]: I0128 07:49:08.357843 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0"} err="failed to get container status \"c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0\": rpc error: code = NotFound desc = could not find container \"c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0\": container with ID starting with c0841c1ce6c5f26dec0bf1a238b784441d6850c0abe584921ae0e2310dfae1b0 not found: ID does not exist" Jan 28 07:49:09 crc kubenswrapper[4776]: I0128 07:49:09.315998 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" path="/var/lib/kubelet/pods/e868dbea-c6b4-4da5-9960-a911c7a3f65e/volumes" Jan 28 07:50:03 crc kubenswrapper[4776]: I0128 07:50:03.852468 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:50:03 crc kubenswrapper[4776]: I0128 07:50:03.853013 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 07:50:33 crc kubenswrapper[4776]: I0128 07:50:33.851822 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:50:33 crc kubenswrapper[4776]: I0128 07:50:33.852625 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:51:03 crc kubenswrapper[4776]: I0128 07:51:03.851894 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:51:03 crc kubenswrapper[4776]: I0128 07:51:03.852467 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:51:03 crc kubenswrapper[4776]: I0128 07:51:03.852516 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:51:03 crc kubenswrapper[4776]: I0128 07:51:03.853300 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda"} 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:51:03 crc kubenswrapper[4776]: I0128 07:51:03.853358 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda" gracePeriod=600 Jan 28 07:51:04 crc kubenswrapper[4776]: I0128 07:51:04.374414 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda" exitCode=0 Jan 28 07:51:04 crc kubenswrapper[4776]: I0128 07:51:04.374504 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda"} Jan 28 07:51:04 crc kubenswrapper[4776]: I0128 07:51:04.374806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12"} Jan 28 07:51:04 crc kubenswrapper[4776]: I0128 07:51:04.374832 4776 scope.go:117] "RemoveContainer" containerID="c82e8a23ef00f85222f03d582aba52ea2ee4acaa05bd627ffb4bf7c5ff8ac2c0" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.390407 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:21 crc kubenswrapper[4776]: E0128 07:53:21.394011 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.394030 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" Jan 28 07:53:21 crc kubenswrapper[4776]: E0128 07:53:21.394062 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="extract-utilities" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.394070 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="extract-utilities" Jan 28 07:53:21 crc kubenswrapper[4776]: E0128 07:53:21.394086 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="extract-content" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.394095 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="extract-content" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.394334 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e868dbea-c6b4-4da5-9960-a911c7a3f65e" containerName="registry-server" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.396045 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.413048 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.513944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj6l\" (UniqueName: \"kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.514401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.514482 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.616209 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj6l\" (UniqueName: \"kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.616339 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.616440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.617058 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.617120 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.643638 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj6l\" (UniqueName: \"kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l\") pod \"redhat-marketplace-s257w\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:21 crc kubenswrapper[4776]: I0128 07:53:21.761995 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:22 crc kubenswrapper[4776]: I0128 07:53:22.252846 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:22 crc kubenswrapper[4776]: I0128 07:53:22.911474 4776 generic.go:334] "Generic (PLEG): container finished" podID="8783e50a-206a-45a9-886f-d56f66c39f96" containerID="13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d" exitCode=0 Jan 28 07:53:22 crc kubenswrapper[4776]: I0128 07:53:22.911580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerDied","Data":"13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d"} Jan 28 07:53:22 crc kubenswrapper[4776]: I0128 07:53:22.911798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerStarted","Data":"4e552944153edf55630fa41e05394f9c960e609390ee60e6f70314a8d24ae23e"} Jan 28 07:53:22 crc kubenswrapper[4776]: I0128 07:53:22.914133 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:53:23 crc kubenswrapper[4776]: I0128 07:53:23.921679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerStarted","Data":"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a"} Jan 28 07:53:24 crc kubenswrapper[4776]: I0128 07:53:24.932415 4776 generic.go:334] "Generic (PLEG): container finished" podID="8783e50a-206a-45a9-886f-d56f66c39f96" containerID="e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a" exitCode=0 Jan 28 07:53:24 crc kubenswrapper[4776]: I0128 07:53:24.932462 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerDied","Data":"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a"} Jan 28 07:53:25 crc kubenswrapper[4776]: I0128 07:53:25.944255 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerStarted","Data":"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129"} Jan 28 07:53:25 crc kubenswrapper[4776]: I0128 07:53:25.980803 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s257w" podStartSLOduration=2.510223685 podStartE2EDuration="4.98078462s" podCreationTimestamp="2026-01-28 07:53:21 +0000 UTC" firstStartedPulling="2026-01-28 07:53:22.913910835 +0000 UTC m=+3774.329570995" lastFinishedPulling="2026-01-28 07:53:25.38447176 +0000 UTC m=+3776.800131930" observedRunningTime="2026-01-28 07:53:25.968697932 +0000 UTC m=+3777.384358092" watchObservedRunningTime="2026-01-28 07:53:25.98078462 +0000 UTC m=+3777.396444780" Jan 28 07:53:31 crc kubenswrapper[4776]: I0128 07:53:31.762417 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:31 crc kubenswrapper[4776]: I0128 07:53:31.762711 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:31 crc kubenswrapper[4776]: I0128 07:53:31.832569 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:32 crc kubenswrapper[4776]: I0128 07:53:32.112828 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:32 crc kubenswrapper[4776]: I0128 07:53:32.182009 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:33 crc kubenswrapper[4776]: I0128 07:53:33.852683 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:53:33 crc kubenswrapper[4776]: I0128 07:53:33.853074 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.063040 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s257w" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="registry-server" containerID="cri-o://99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129" gracePeriod=2 Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.586866 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.694293 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njj6l\" (UniqueName: \"kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l\") pod \"8783e50a-206a-45a9-886f-d56f66c39f96\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.694442 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content\") pod \"8783e50a-206a-45a9-886f-d56f66c39f96\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.694703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities\") pod \"8783e50a-206a-45a9-886f-d56f66c39f96\" (UID: \"8783e50a-206a-45a9-886f-d56f66c39f96\") " Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.697012 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities" (OuterVolumeSpecName: "utilities") pod "8783e50a-206a-45a9-886f-d56f66c39f96" (UID: "8783e50a-206a-45a9-886f-d56f66c39f96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.706858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l" (OuterVolumeSpecName: "kube-api-access-njj6l") pod "8783e50a-206a-45a9-886f-d56f66c39f96" (UID: "8783e50a-206a-45a9-886f-d56f66c39f96"). InnerVolumeSpecName "kube-api-access-njj6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.724373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8783e50a-206a-45a9-886f-d56f66c39f96" (UID: "8783e50a-206a-45a9-886f-d56f66c39f96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.797478 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.797770 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8783e50a-206a-45a9-886f-d56f66c39f96-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:53:34 crc kubenswrapper[4776]: I0128 07:53:34.797783 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njj6l\" (UniqueName: \"kubernetes.io/projected/8783e50a-206a-45a9-886f-d56f66c39f96-kube-api-access-njj6l\") on node \"crc\" DevicePath \"\"" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.081502 4776 generic.go:334] "Generic (PLEG): container finished" podID="8783e50a-206a-45a9-886f-d56f66c39f96" containerID="99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129" exitCode=0 Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.081666 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerDied","Data":"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129"} Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.081724 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-s257w" event={"ID":"8783e50a-206a-45a9-886f-d56f66c39f96","Type":"ContainerDied","Data":"4e552944153edf55630fa41e05394f9c960e609390ee60e6f70314a8d24ae23e"} Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.081765 4776 scope.go:117] "RemoveContainer" containerID="99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.081863 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s257w" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.118120 4776 scope.go:117] "RemoveContainer" containerID="e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.140232 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.149080 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s257w"] Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.158277 4776 scope.go:117] "RemoveContainer" containerID="13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.211403 4776 scope.go:117] "RemoveContainer" containerID="99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129" Jan 28 07:53:35 crc kubenswrapper[4776]: E0128 07:53:35.211900 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129\": container with ID starting with 99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129 not found: ID does not exist" containerID="99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.211963 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129"} err="failed to get container status \"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129\": rpc error: code = NotFound desc = could not find container \"99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129\": container with ID starting with 99fd241767313665f61322f14a59e85dcaeefc56338517fd8d1a1e5d12532129 not found: ID does not exist" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.211998 4776 scope.go:117] "RemoveContainer" containerID="e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a" Jan 28 07:53:35 crc kubenswrapper[4776]: E0128 07:53:35.212345 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a\": container with ID starting with e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a not found: ID does not exist" containerID="e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.212382 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a"} err="failed to get container status \"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a\": rpc error: code = NotFound desc = could not find container \"e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a\": container with ID starting with e2c50cbcb79439509f33d30479e8eec1bb99c774853e6319807887df2c470c7a not found: ID does not exist" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.212403 4776 scope.go:117] "RemoveContainer" containerID="13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d" Jan 28 07:53:35 crc kubenswrapper[4776]: E0128 
07:53:35.213234 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d\": container with ID starting with 13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d not found: ID does not exist" containerID="13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.213391 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d"} err="failed to get container status \"13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d\": rpc error: code = NotFound desc = could not find container \"13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d\": container with ID starting with 13c22c683b6b212a945b358b223d30894ef33f34b0defe687b34b766043e1f2d not found: ID does not exist" Jan 28 07:53:35 crc kubenswrapper[4776]: I0128 07:53:35.316018 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" path="/var/lib/kubelet/pods/8783e50a-206a-45a9-886f-d56f66c39f96/volumes" Jan 28 07:54:03 crc kubenswrapper[4776]: I0128 07:54:03.852743 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:54:03 crc kubenswrapper[4776]: I0128 07:54:03.853394 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 07:54:33 crc kubenswrapper[4776]: I0128 07:54:33.852410 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 07:54:33 crc kubenswrapper[4776]: I0128 07:54:33.852996 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 07:54:33 crc kubenswrapper[4776]: I0128 07:54:33.853047 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 07:54:33 crc kubenswrapper[4776]: I0128 07:54:33.853840 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 07:54:33 crc kubenswrapper[4776]: I0128 07:54:33.853904 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" gracePeriod=600 Jan 28 07:54:33 crc kubenswrapper[4776]: E0128 07:54:33.982927 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:54:34 crc kubenswrapper[4776]: I0128 07:54:34.798794 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" exitCode=0 Jan 28 07:54:34 crc kubenswrapper[4776]: I0128 07:54:34.798838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12"} Jan 28 07:54:34 crc kubenswrapper[4776]: I0128 07:54:34.798869 4776 scope.go:117] "RemoveContainer" containerID="432f15ce1d09416b3615cc6d6d550416087724975f99751c0f2925985b3eadda" Jan 28 07:54:34 crc kubenswrapper[4776]: I0128 07:54:34.799503 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:54:34 crc kubenswrapper[4776]: E0128 07:54:34.799938 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:54:50 crc kubenswrapper[4776]: I0128 07:54:50.304275 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:54:50 crc kubenswrapper[4776]: E0128 07:54:50.304964 4776 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:55:05 crc kubenswrapper[4776]: I0128 07:55:05.305972 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:55:05 crc kubenswrapper[4776]: E0128 07:55:05.307266 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:55:16 crc kubenswrapper[4776]: I0128 07:55:16.305440 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:55:16 crc kubenswrapper[4776]: E0128 07:55:16.307650 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:55:28 crc kubenswrapper[4776]: I0128 07:55:28.304403 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:55:28 crc kubenswrapper[4776]: E0128 07:55:28.305232 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:55:43 crc kubenswrapper[4776]: I0128 07:55:43.305469 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:55:43 crc kubenswrapper[4776]: E0128 07:55:43.306244 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:55:57 crc kubenswrapper[4776]: I0128 07:55:57.305333 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:55:57 crc kubenswrapper[4776]: E0128 07:55:57.306673 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:56:12 crc kubenswrapper[4776]: I0128 07:56:12.305088 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:56:12 crc kubenswrapper[4776]: E0128 
07:56:12.306076 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:56:23 crc kubenswrapper[4776]: I0128 07:56:23.305621 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:56:23 crc kubenswrapper[4776]: E0128 07:56:23.308006 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:56:35 crc kubenswrapper[4776]: I0128 07:56:35.307329 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:56:35 crc kubenswrapper[4776]: E0128 07:56:35.309803 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:56:48 crc kubenswrapper[4776]: I0128 07:56:48.304892 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:56:48 crc 
kubenswrapper[4776]: E0128 07:56:48.305676 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:57:01 crc kubenswrapper[4776]: I0128 07:57:01.304926 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:57:01 crc kubenswrapper[4776]: E0128 07:57:01.310190 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:57:12 crc kubenswrapper[4776]: I0128 07:57:12.305822 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:57:12 crc kubenswrapper[4776]: E0128 07:57:12.306787 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:57:26 crc kubenswrapper[4776]: I0128 07:57:26.305041 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 
28 07:57:26 crc kubenswrapper[4776]: E0128 07:57:26.305922 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:57:38 crc kubenswrapper[4776]: I0128 07:57:38.305362 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:57:38 crc kubenswrapper[4776]: E0128 07:57:38.306375 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:57:53 crc kubenswrapper[4776]: I0128 07:57:53.305882 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:57:53 crc kubenswrapper[4776]: E0128 07:57:53.307165 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:05 crc kubenswrapper[4776]: I0128 07:58:05.305443 4776 scope.go:117] "RemoveContainer" 
containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:58:05 crc kubenswrapper[4776]: E0128 07:58:05.306211 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.069692 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-82xql"] Jan 28 07:58:15 crc kubenswrapper[4776]: E0128 07:58:15.071849 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="registry-server" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.071969 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="registry-server" Jan 28 07:58:15 crc kubenswrapper[4776]: E0128 07:58:15.072063 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="extract-utilities" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.072151 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="extract-utilities" Jan 28 07:58:15 crc kubenswrapper[4776]: E0128 07:58:15.072255 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="extract-content" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.072340 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="extract-content" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.072684 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8783e50a-206a-45a9-886f-d56f66c39f96" containerName="registry-server" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.074897 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.090810 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82xql"] Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.224290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-catalog-content\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.224460 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqd5k\" (UniqueName: \"kubernetes.io/projected/abb32b75-3393-47dc-a543-bfa5745c4ec6-kube-api-access-dqd5k\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.224595 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-utilities\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.325922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-utilities\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.325999 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-catalog-content\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.326111 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqd5k\" (UniqueName: \"kubernetes.io/projected/abb32b75-3393-47dc-a543-bfa5745c4ec6-kube-api-access-dqd5k\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.326447 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-utilities\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.326590 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb32b75-3393-47dc-a543-bfa5745c4ec6-catalog-content\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.350223 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqd5k\" (UniqueName: 
\"kubernetes.io/projected/abb32b75-3393-47dc-a543-bfa5745c4ec6-kube-api-access-dqd5k\") pod \"community-operators-82xql\" (UID: \"abb32b75-3393-47dc-a543-bfa5745c4ec6\") " pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.398100 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:15 crc kubenswrapper[4776]: I0128 07:58:15.809660 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82xql"] Jan 28 07:58:16 crc kubenswrapper[4776]: I0128 07:58:16.158177 4776 generic.go:334] "Generic (PLEG): container finished" podID="abb32b75-3393-47dc-a543-bfa5745c4ec6" containerID="c8657f6f7ede6b2f4ed736217d8279211a49c0319ede77be156b644c3c067058" exitCode=0 Jan 28 07:58:16 crc kubenswrapper[4776]: I0128 07:58:16.158230 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82xql" event={"ID":"abb32b75-3393-47dc-a543-bfa5745c4ec6","Type":"ContainerDied","Data":"c8657f6f7ede6b2f4ed736217d8279211a49c0319ede77be156b644c3c067058"} Jan 28 07:58:16 crc kubenswrapper[4776]: I0128 07:58:16.158427 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82xql" event={"ID":"abb32b75-3393-47dc-a543-bfa5745c4ec6","Type":"ContainerStarted","Data":"a5ea6dcb9f63f56aa16915108bf7a4c3e023618f031c2e8c11bb7df31eb07935"} Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.469733 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.473378 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.482451 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.597791 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.597885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.597958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9682\" (UniqueName: \"kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.699447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.699650 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.699763 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9682\" (UniqueName: \"kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.700150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.700175 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.722502 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9682\" (UniqueName: \"kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682\") pod \"certified-operators-289qg\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:17 crc kubenswrapper[4776]: I0128 07:58:17.813841 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:18 crc kubenswrapper[4776]: I0128 07:58:18.304901 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:58:18 crc kubenswrapper[4776]: E0128 07:58:18.305584 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:18 crc kubenswrapper[4776]: I0128 07:58:18.438799 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:18 crc kubenswrapper[4776]: W0128 07:58:18.447817 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda28da1a7_88a8_40d7_86fc_ab285f6d82ee.slice/crio-72aa76e648fc93718815be170fcb0a582e351f09a1d2416117c8ed387dd65ada WatchSource:0}: Error finding container 72aa76e648fc93718815be170fcb0a582e351f09a1d2416117c8ed387dd65ada: Status 404 returned error can't find the container with id 72aa76e648fc93718815be170fcb0a582e351f09a1d2416117c8ed387dd65ada Jan 28 07:58:19 crc kubenswrapper[4776]: I0128 07:58:19.185907 4776 generic.go:334] "Generic (PLEG): container finished" podID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerID="1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7" exitCode=0 Jan 28 07:58:19 crc kubenswrapper[4776]: I0128 07:58:19.186233 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" 
event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerDied","Data":"1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7"} Jan 28 07:58:19 crc kubenswrapper[4776]: I0128 07:58:19.186265 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerStarted","Data":"72aa76e648fc93718815be170fcb0a582e351f09a1d2416117c8ed387dd65ada"} Jan 28 07:58:22 crc kubenswrapper[4776]: I0128 07:58:22.247992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerStarted","Data":"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d"} Jan 28 07:58:22 crc kubenswrapper[4776]: I0128 07:58:22.251299 4776 generic.go:334] "Generic (PLEG): container finished" podID="abb32b75-3393-47dc-a543-bfa5745c4ec6" containerID="8dba9c9adde708299edbc681aa52e6026ad84829b8afef291932fb0efac56623" exitCode=0 Jan 28 07:58:22 crc kubenswrapper[4776]: I0128 07:58:22.251352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82xql" event={"ID":"abb32b75-3393-47dc-a543-bfa5745c4ec6","Type":"ContainerDied","Data":"8dba9c9adde708299edbc681aa52e6026ad84829b8afef291932fb0efac56623"} Jan 28 07:58:23 crc kubenswrapper[4776]: I0128 07:58:23.270464 4776 generic.go:334] "Generic (PLEG): container finished" podID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerID="c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d" exitCode=0 Jan 28 07:58:23 crc kubenswrapper[4776]: I0128 07:58:23.270587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerDied","Data":"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d"} Jan 28 07:58:23 crc kubenswrapper[4776]: I0128 
07:58:23.273898 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 07:58:23 crc kubenswrapper[4776]: I0128 07:58:23.274717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82xql" event={"ID":"abb32b75-3393-47dc-a543-bfa5745c4ec6","Type":"ContainerStarted","Data":"9082dafd7054d2516144bfb330b416ad8eb4d1f8e2087eb70d9abb9c5441e389"} Jan 28 07:58:23 crc kubenswrapper[4776]: I0128 07:58:23.332247 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-82xql" podStartSLOduration=1.714230581 podStartE2EDuration="8.332218825s" podCreationTimestamp="2026-01-28 07:58:15 +0000 UTC" firstStartedPulling="2026-01-28 07:58:16.160793203 +0000 UTC m=+4067.576453373" lastFinishedPulling="2026-01-28 07:58:22.778781457 +0000 UTC m=+4074.194441617" observedRunningTime="2026-01-28 07:58:23.321930026 +0000 UTC m=+4074.737590186" watchObservedRunningTime="2026-01-28 07:58:23.332218825 +0000 UTC m=+4074.747879005" Jan 28 07:58:24 crc kubenswrapper[4776]: I0128 07:58:24.286626 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerStarted","Data":"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4"} Jan 28 07:58:24 crc kubenswrapper[4776]: I0128 07:58:24.321639 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-289qg" podStartSLOduration=4.686214406 podStartE2EDuration="7.321621738s" podCreationTimestamp="2026-01-28 07:58:17 +0000 UTC" firstStartedPulling="2026-01-28 07:58:21.087540272 +0000 UTC m=+4072.503200462" lastFinishedPulling="2026-01-28 07:58:23.722947634 +0000 UTC m=+4075.138607794" observedRunningTime="2026-01-28 07:58:24.313226541 +0000 UTC m=+4075.728886711" watchObservedRunningTime="2026-01-28 
07:58:24.321621738 +0000 UTC m=+4075.737281898" Jan 28 07:58:25 crc kubenswrapper[4776]: I0128 07:58:25.398407 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:25 crc kubenswrapper[4776]: I0128 07:58:25.398483 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:25 crc kubenswrapper[4776]: I0128 07:58:25.454567 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:27 crc kubenswrapper[4776]: I0128 07:58:27.815694 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:27 crc kubenswrapper[4776]: I0128 07:58:27.816189 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:27 crc kubenswrapper[4776]: I0128 07:58:27.909082 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:28 crc kubenswrapper[4776]: I0128 07:58:28.395913 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:28 crc kubenswrapper[4776]: I0128 07:58:28.662644 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:30 crc kubenswrapper[4776]: I0128 07:58:30.304719 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:58:30 crc kubenswrapper[4776]: E0128 07:58:30.305613 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:30 crc kubenswrapper[4776]: I0128 07:58:30.349917 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-289qg" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="registry-server" containerID="cri-o://ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4" gracePeriod=2 Jan 28 07:58:30 crc kubenswrapper[4776]: I0128 07:58:30.895904 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.030328 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9682\" (UniqueName: \"kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682\") pod \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.030608 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities\") pod \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.030667 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content\") pod \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\" (UID: \"a28da1a7-88a8-40d7-86fc-ab285f6d82ee\") " Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.034920 4776 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities" (OuterVolumeSpecName: "utilities") pod "a28da1a7-88a8-40d7-86fc-ab285f6d82ee" (UID: "a28da1a7-88a8-40d7-86fc-ab285f6d82ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.040473 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682" (OuterVolumeSpecName: "kube-api-access-j9682") pod "a28da1a7-88a8-40d7-86fc-ab285f6d82ee" (UID: "a28da1a7-88a8-40d7-86fc-ab285f6d82ee"). InnerVolumeSpecName "kube-api-access-j9682". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.082283 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a28da1a7-88a8-40d7-86fc-ab285f6d82ee" (UID: "a28da1a7-88a8-40d7-86fc-ab285f6d82ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.133129 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.133156 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.133166 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9682\" (UniqueName: \"kubernetes.io/projected/a28da1a7-88a8-40d7-86fc-ab285f6d82ee-kube-api-access-j9682\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.362221 4776 generic.go:334] "Generic (PLEG): container finished" podID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerID="ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4" exitCode=0 Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.362265 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerDied","Data":"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4"} Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.362368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-289qg" event={"ID":"a28da1a7-88a8-40d7-86fc-ab285f6d82ee","Type":"ContainerDied","Data":"72aa76e648fc93718815be170fcb0a582e351f09a1d2416117c8ed387dd65ada"} Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.362402 4776 scope.go:117] "RemoveContainer" containerID="ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 
07:58:31.362282 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-289qg" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.399191 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.403701 4776 scope.go:117] "RemoveContainer" containerID="c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.410767 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-289qg"] Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.438409 4776 scope.go:117] "RemoveContainer" containerID="1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.481482 4776 scope.go:117] "RemoveContainer" containerID="ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4" Jan 28 07:58:31 crc kubenswrapper[4776]: E0128 07:58:31.481967 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4\": container with ID starting with ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4 not found: ID does not exist" containerID="ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.482016 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4"} err="failed to get container status \"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4\": rpc error: code = NotFound desc = could not find container \"ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4\": container with ID starting with 
ea4a9aeb5e78cfe42a2d260f1418a0cd4a5b07c6f94486f7b3cf355773b9b7c4 not found: ID does not exist" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.482050 4776 scope.go:117] "RemoveContainer" containerID="c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d" Jan 28 07:58:31 crc kubenswrapper[4776]: E0128 07:58:31.482464 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d\": container with ID starting with c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d not found: ID does not exist" containerID="c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.482539 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d"} err="failed to get container status \"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d\": rpc error: code = NotFound desc = could not find container \"c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d\": container with ID starting with c59204aebbb9412bcc8bc4daae9061b21e48d2bd277129fded90cd00bc5dc40d not found: ID does not exist" Jan 28 07:58:31 crc kubenswrapper[4776]: I0128 07:58:31.482616 4776 scope.go:117] "RemoveContainer" containerID="1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7" Jan 28 07:58:31 crc kubenswrapper[4776]: E0128 07:58:31.483047 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7\": container with ID starting with 1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7 not found: ID does not exist" containerID="1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7" Jan 28 07:58:31 crc 
kubenswrapper[4776]: I0128 07:58:31.483090 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7"} err="failed to get container status \"1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7\": rpc error: code = NotFound desc = could not find container \"1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7\": container with ID starting with 1ef3e8d0700f26a64d448b317f55347483aa317b3cb9dd09a130db35271a65b7 not found: ID does not exist" Jan 28 07:58:33 crc kubenswrapper[4776]: I0128 07:58:33.322658 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" path="/var/lib/kubelet/pods/a28da1a7-88a8-40d7-86fc-ab285f6d82ee/volumes" Jan 28 07:58:35 crc kubenswrapper[4776]: I0128 07:58:35.446271 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-82xql" Jan 28 07:58:36 crc kubenswrapper[4776]: I0128 07:58:36.297364 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82xql"] Jan 28 07:58:36 crc kubenswrapper[4776]: I0128 07:58:36.468819 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:58:36 crc kubenswrapper[4776]: I0128 07:58:36.469239 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bl4vb" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" containerID="cri-o://f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" gracePeriod=2 Jan 28 07:58:36 crc kubenswrapper[4776]: E0128 07:58:36.867666 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c is running failed: container process not found" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 07:58:36 crc kubenswrapper[4776]: E0128 07:58:36.868197 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c is running failed: container process not found" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 07:58:36 crc kubenswrapper[4776]: E0128 07:58:36.868735 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c is running failed: container process not found" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 07:58:36 crc kubenswrapper[4776]: E0128 07:58:36.868790 4776 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-bl4vb" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.013926 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.158722 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content\") pod \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.158820 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities\") pod \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.158982 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv7v\" (UniqueName: \"kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v\") pod \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\" (UID: \"5b09add3-701c-4c0e-ac04-0e974a7bec0d\") " Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.159441 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities" (OuterVolumeSpecName: "utilities") pod "5b09add3-701c-4c0e-ac04-0e974a7bec0d" (UID: "5b09add3-701c-4c0e-ac04-0e974a7bec0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.159651 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.164647 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v" (OuterVolumeSpecName: "kube-api-access-znv7v") pod "5b09add3-701c-4c0e-ac04-0e974a7bec0d" (UID: "5b09add3-701c-4c0e-ac04-0e974a7bec0d"). InnerVolumeSpecName "kube-api-access-znv7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.207789 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b09add3-701c-4c0e-ac04-0e974a7bec0d" (UID: "5b09add3-701c-4c0e-ac04-0e974a7bec0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.261226 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv7v\" (UniqueName: \"kubernetes.io/projected/5b09add3-701c-4c0e-ac04-0e974a7bec0d-kube-api-access-znv7v\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.261264 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b09add3-701c-4c0e-ac04-0e974a7bec0d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.428445 4776 generic.go:334] "Generic (PLEG): container finished" podID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" exitCode=0 Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.428501 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bl4vb" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.428503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerDied","Data":"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c"} Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.428587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl4vb" event={"ID":"5b09add3-701c-4c0e-ac04-0e974a7bec0d","Type":"ContainerDied","Data":"73864fa6278ecb5fd1b76f457ec6a06d1fa41c1791cb6bd32ccdce62f264b6d2"} Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.428609 4776 scope.go:117] "RemoveContainer" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.465744 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.477715 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bl4vb"] Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.480255 4776 scope.go:117] "RemoveContainer" containerID="ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.517847 4776 scope.go:117] "RemoveContainer" containerID="26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.572230 4776 scope.go:117] "RemoveContainer" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" Jan 28 07:58:37 crc kubenswrapper[4776]: E0128 07:58:37.572585 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c\": container with ID starting with f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c not found: ID does not exist" containerID="f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.572619 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c"} err="failed to get container status \"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c\": rpc error: code = NotFound desc = could not find container \"f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c\": container with ID starting with f61102eac1491b2fd38eb2a4236a6db38c5379c742088bbb2852e3cd504e001c not found: ID does not exist" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.572638 4776 scope.go:117] "RemoveContainer" 
containerID="ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b" Jan 28 07:58:37 crc kubenswrapper[4776]: E0128 07:58:37.573012 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b\": container with ID starting with ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b not found: ID does not exist" containerID="ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.573048 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b"} err="failed to get container status \"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b\": rpc error: code = NotFound desc = could not find container \"ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b\": container with ID starting with ed4050dcab7d890d120de3a02d841159d88700cdd605ed44463f4c6aa3d1085b not found: ID does not exist" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.573063 4776 scope.go:117] "RemoveContainer" containerID="26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9" Jan 28 07:58:37 crc kubenswrapper[4776]: E0128 07:58:37.573326 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9\": container with ID starting with 26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9 not found: ID does not exist" containerID="26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9" Jan 28 07:58:37 crc kubenswrapper[4776]: I0128 07:58:37.573347 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9"} err="failed to get container status \"26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9\": rpc error: code = NotFound desc = could not find container \"26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9\": container with ID starting with 26da24f977272065c55f13c02f6e6f5f6c4d4e9b2587dbc19e663f0d5438f7d9 not found: ID does not exist" Jan 28 07:58:39 crc kubenswrapper[4776]: I0128 07:58:39.318018 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" path="/var/lib/kubelet/pods/5b09add3-701c-4c0e-ac04-0e974a7bec0d/volumes" Jan 28 07:58:42 crc kubenswrapper[4776]: I0128 07:58:42.304340 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:58:42 crc kubenswrapper[4776]: E0128 07:58:42.305087 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.304392 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.305274 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.648535 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649027 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649048 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649068 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="extract-utilities" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649077 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="extract-utilities" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649098 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="extract-content" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649108 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="extract-content" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649130 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="extract-utilities" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649140 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="extract-utilities" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649153 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="extract-content" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649161 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="extract-content" Jan 28 07:58:55 crc kubenswrapper[4776]: E0128 07:58:55.649183 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649193 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649445 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b09add3-701c-4c0e-ac04-0e974a7bec0d" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.649483 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28da1a7-88a8-40d7-86fc-ab285f6d82ee" containerName="registry-server" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.652832 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.659797 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.764039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.764108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.764208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4n5l\" (UniqueName: \"kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.866240 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4n5l\" (UniqueName: \"kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.866386 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.866431 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.866860 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.866966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.883742 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4n5l\" (UniqueName: \"kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l\") pod \"redhat-operators-5g96z\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:55 crc kubenswrapper[4776]: I0128 07:58:55.975583 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:58:56 crc kubenswrapper[4776]: I0128 07:58:56.437443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:58:56 crc kubenswrapper[4776]: I0128 07:58:56.621207 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerStarted","Data":"f450e2b2c422d546fa08dbc4c7b3ad8793d6237aa186bc9db00d7d8f8df0a885"} Jan 28 07:58:57 crc kubenswrapper[4776]: I0128 07:58:57.635877 4776 generic.go:334] "Generic (PLEG): container finished" podID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerID="a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46" exitCode=0 Jan 28 07:58:57 crc kubenswrapper[4776]: I0128 07:58:57.636254 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerDied","Data":"a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46"} Jan 28 07:58:58 crc kubenswrapper[4776]: I0128 07:58:58.651769 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerStarted","Data":"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d"} Jan 28 07:59:03 crc kubenswrapper[4776]: I0128 07:59:03.707227 4776 generic.go:334] "Generic (PLEG): container finished" podID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerID="0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d" exitCode=0 Jan 28 07:59:03 crc kubenswrapper[4776]: I0128 07:59:03.707286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" 
event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerDied","Data":"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d"} Jan 28 07:59:04 crc kubenswrapper[4776]: I0128 07:59:04.719922 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerStarted","Data":"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44"} Jan 28 07:59:05 crc kubenswrapper[4776]: I0128 07:59:05.101406 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5g96z" podStartSLOduration=3.548610768 podStartE2EDuration="10.101371103s" podCreationTimestamp="2026-01-28 07:58:55 +0000 UTC" firstStartedPulling="2026-01-28 07:58:57.638702929 +0000 UTC m=+4109.054363109" lastFinishedPulling="2026-01-28 07:59:04.191463274 +0000 UTC m=+4115.607123444" observedRunningTime="2026-01-28 07:59:05.089117042 +0000 UTC m=+4116.504777212" watchObservedRunningTime="2026-01-28 07:59:05.101371103 +0000 UTC m=+4116.517031303" Jan 28 07:59:05 crc kubenswrapper[4776]: I0128 07:59:05.977582 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:05 crc kubenswrapper[4776]: I0128 07:59:05.977889 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:06 crc kubenswrapper[4776]: I0128 07:59:06.305877 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:59:06 crc kubenswrapper[4776]: E0128 07:59:06.306342 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:59:07 crc kubenswrapper[4776]: I0128 07:59:07.051078 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5g96z" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="registry-server" probeResult="failure" output=< Jan 28 07:59:07 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 07:59:07 crc kubenswrapper[4776]: > Jan 28 07:59:16 crc kubenswrapper[4776]: I0128 07:59:16.074915 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:16 crc kubenswrapper[4776]: I0128 07:59:16.149789 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:16 crc kubenswrapper[4776]: I0128 07:59:16.323262 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:59:17 crc kubenswrapper[4776]: I0128 07:59:17.855739 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5g96z" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="registry-server" containerID="cri-o://e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44" gracePeriod=2 Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.488445 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.639527 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities\") pod \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.639756 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content\") pod \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.639779 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4n5l\" (UniqueName: \"kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l\") pod \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\" (UID: \"37db7c1f-4375-4bc6-970d-4e5e50cb968a\") " Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.641030 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities" (OuterVolumeSpecName: "utilities") pod "37db7c1f-4375-4bc6-970d-4e5e50cb968a" (UID: "37db7c1f-4375-4bc6-970d-4e5e50cb968a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.645128 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l" (OuterVolumeSpecName: "kube-api-access-n4n5l") pod "37db7c1f-4375-4bc6-970d-4e5e50cb968a" (UID: "37db7c1f-4375-4bc6-970d-4e5e50cb968a"). InnerVolumeSpecName "kube-api-access-n4n5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.742511 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.742565 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4n5l\" (UniqueName: \"kubernetes.io/projected/37db7c1f-4375-4bc6-970d-4e5e50cb968a-kube-api-access-n4n5l\") on node \"crc\" DevicePath \"\"" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.756816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37db7c1f-4375-4bc6-970d-4e5e50cb968a" (UID: "37db7c1f-4375-4bc6-970d-4e5e50cb968a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.844798 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37db7c1f-4375-4bc6-970d-4e5e50cb968a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.868917 4776 generic.go:334] "Generic (PLEG): container finished" podID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerID="e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44" exitCode=0 Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.868989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerDied","Data":"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44"} Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.869006 4776 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5g96z" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.869052 4776 scope.go:117] "RemoveContainer" containerID="e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.869040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g96z" event={"ID":"37db7c1f-4375-4bc6-970d-4e5e50cb968a","Type":"ContainerDied","Data":"f450e2b2c422d546fa08dbc4c7b3ad8793d6237aa186bc9db00d7d8f8df0a885"} Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.908689 4776 scope.go:117] "RemoveContainer" containerID="0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.914246 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.923206 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5g96z"] Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.933756 4776 scope.go:117] "RemoveContainer" containerID="a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.985945 4776 scope.go:117] "RemoveContainer" containerID="e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44" Jan 28 07:59:18 crc kubenswrapper[4776]: E0128 07:59:18.986310 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44\": container with ID starting with e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44 not found: ID does not exist" containerID="e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.986351 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44"} err="failed to get container status \"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44\": rpc error: code = NotFound desc = could not find container \"e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44\": container with ID starting with e4a54478fd36f2c72f3b320252689f562b2b47e92d49a362a4c66c08825dff44 not found: ID does not exist" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.986377 4776 scope.go:117] "RemoveContainer" containerID="0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d" Jan 28 07:59:18 crc kubenswrapper[4776]: E0128 07:59:18.986672 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d\": container with ID starting with 0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d not found: ID does not exist" containerID="0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.986715 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d"} err="failed to get container status \"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d\": rpc error: code = NotFound desc = could not find container \"0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d\": container with ID starting with 0f4796a449603d007b7dd8a1b9a243d2a6ec992f989f5aa3c42b8394b342550d not found: ID does not exist" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.986741 4776 scope.go:117] "RemoveContainer" containerID="a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46" Jan 28 07:59:18 crc kubenswrapper[4776]: E0128 
07:59:18.987078 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46\": container with ID starting with a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46 not found: ID does not exist" containerID="a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46" Jan 28 07:59:18 crc kubenswrapper[4776]: I0128 07:59:18.987102 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46"} err="failed to get container status \"a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46\": rpc error: code = NotFound desc = could not find container \"a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46\": container with ID starting with a55df1fb310c113efe058d595e4832c99c46f230daf6c1bb3f22381c18cf0f46 not found: ID does not exist" Jan 28 07:59:19 crc kubenswrapper[4776]: I0128 07:59:19.315026 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" path="/var/lib/kubelet/pods/37db7c1f-4375-4bc6-970d-4e5e50cb968a/volumes" Jan 28 07:59:20 crc kubenswrapper[4776]: I0128 07:59:20.304290 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:59:20 crc kubenswrapper[4776]: E0128 07:59:20.304906 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:59:32 crc kubenswrapper[4776]: I0128 07:59:32.306206 
4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:59:32 crc kubenswrapper[4776]: E0128 07:59:32.307091 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 07:59:46 crc kubenswrapper[4776]: I0128 07:59:46.305694 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 07:59:47 crc kubenswrapper[4776]: I0128 07:59:47.211685 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac"} Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.187260 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc"] Jan 28 08:00:00 crc kubenswrapper[4776]: E0128 08:00:00.188147 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="extract-utilities" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.188162 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="extract-utilities" Jan 28 08:00:00 crc kubenswrapper[4776]: E0128 08:00:00.188173 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="extract-content" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.188178 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="extract-content" Jan 28 08:00:00 crc kubenswrapper[4776]: E0128 08:00:00.188218 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="registry-server" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.188227 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="registry-server" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.188442 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="37db7c1f-4375-4bc6-970d-4e5e50cb968a" containerName="registry-server" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.189091 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.191337 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.192194 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.226351 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc"] Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.230268 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc 
kubenswrapper[4776]: I0128 08:00:00.230331 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.230445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhmv\" (UniqueName: \"kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.332231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.332313 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.332479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhmv\" (UniqueName: \"kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv\") pod 
\"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.333595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.339810 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.350558 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhmv\" (UniqueName: \"kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv\") pod \"collect-profiles-29493120-n64gc\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.513764 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:00 crc kubenswrapper[4776]: I0128 08:00:00.964914 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc"] Jan 28 08:00:00 crc kubenswrapper[4776]: W0128 08:00:00.970364 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633dee55_5519_4393_afe0_baa3a3ea76e6.slice/crio-a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0 WatchSource:0}: Error finding container a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0: Status 404 returned error can't find the container with id a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0 Jan 28 08:00:01 crc kubenswrapper[4776]: I0128 08:00:01.359467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" event={"ID":"633dee55-5519-4393-afe0-baa3a3ea76e6","Type":"ContainerStarted","Data":"cda085ad24768e365e0210dc3f1fdb45f7553e09d2afd162c6034893b99f9fbe"} Jan 28 08:00:01 crc kubenswrapper[4776]: I0128 08:00:01.359928 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" event={"ID":"633dee55-5519-4393-afe0-baa3a3ea76e6","Type":"ContainerStarted","Data":"a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0"} Jan 28 08:00:01 crc kubenswrapper[4776]: I0128 08:00:01.380428 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" podStartSLOduration=1.380408654 podStartE2EDuration="1.380408654s" podCreationTimestamp="2026-01-28 08:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 
08:00:01.378028269 +0000 UTC m=+4172.793688469" watchObservedRunningTime="2026-01-28 08:00:01.380408654 +0000 UTC m=+4172.796068814" Jan 28 08:00:02 crc kubenswrapper[4776]: I0128 08:00:02.369944 4776 generic.go:334] "Generic (PLEG): container finished" podID="633dee55-5519-4393-afe0-baa3a3ea76e6" containerID="cda085ad24768e365e0210dc3f1fdb45f7553e09d2afd162c6034893b99f9fbe" exitCode=0 Jan 28 08:00:02 crc kubenswrapper[4776]: I0128 08:00:02.370641 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" event={"ID":"633dee55-5519-4393-afe0-baa3a3ea76e6","Type":"ContainerDied","Data":"cda085ad24768e365e0210dc3f1fdb45f7553e09d2afd162c6034893b99f9fbe"} Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.350332 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.391696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" event={"ID":"633dee55-5519-4393-afe0-baa3a3ea76e6","Type":"ContainerDied","Data":"a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0"} Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.391737 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493120-n64gc" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.391748 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3e436f46bd76770938474d6c306353dd8c0365d22243bff19c75bcee52c9ce0" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.437195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume\") pod \"633dee55-5519-4393-afe0-baa3a3ea76e6\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.437416 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume\") pod \"633dee55-5519-4393-afe0-baa3a3ea76e6\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.437565 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xhmv\" (UniqueName: \"kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv\") pod \"633dee55-5519-4393-afe0-baa3a3ea76e6\" (UID: \"633dee55-5519-4393-afe0-baa3a3ea76e6\") " Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.438035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "633dee55-5519-4393-afe0-baa3a3ea76e6" (UID: "633dee55-5519-4393-afe0-baa3a3ea76e6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.445783 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "633dee55-5519-4393-afe0-baa3a3ea76e6" (UID: "633dee55-5519-4393-afe0-baa3a3ea76e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.445855 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv" (OuterVolumeSpecName: "kube-api-access-6xhmv") pod "633dee55-5519-4393-afe0-baa3a3ea76e6" (UID: "633dee55-5519-4393-afe0-baa3a3ea76e6"). InnerVolumeSpecName "kube-api-access-6xhmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.447468 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xhmv\" (UniqueName: \"kubernetes.io/projected/633dee55-5519-4393-afe0-baa3a3ea76e6-kube-api-access-6xhmv\") on node \"crc\" DevicePath \"\"" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.447494 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/633dee55-5519-4393-afe0-baa3a3ea76e6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.447510 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/633dee55-5519-4393-afe0-baa3a3ea76e6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:00:04 crc kubenswrapper[4776]: I0128 08:00:04.454039 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5"] Jan 28 08:00:04 crc kubenswrapper[4776]: 
I0128 08:00:04.462055 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493075-59vj5"] Jan 28 08:00:05 crc kubenswrapper[4776]: I0128 08:00:05.318459 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac762e3d-99f1-442c-be4c-9b31c622a055" path="/var/lib/kubelet/pods/ac762e3d-99f1-442c-be4c-9b31c622a055/volumes" Jan 28 08:00:49 crc kubenswrapper[4776]: I0128 08:00:49.823967 4776 scope.go:117] "RemoveContainer" containerID="48790fb9c097dcd680e80dc39c870ebb81ebec4eca6685eedfd1b9cde58445c0" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.155514 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29493121-7fxmq"] Jan 28 08:01:00 crc kubenswrapper[4776]: E0128 08:01:00.156532 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633dee55-5519-4393-afe0-baa3a3ea76e6" containerName="collect-profiles" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.156577 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="633dee55-5519-4393-afe0-baa3a3ea76e6" containerName="collect-profiles" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.156905 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="633dee55-5519-4393-afe0-baa3a3ea76e6" containerName="collect-profiles" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.157878 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.172056 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493121-7fxmq"] Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.284878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.284989 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.285078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnfpd\" (UniqueName: \"kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.285109 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.388022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.388254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.388518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnfpd\" (UniqueName: \"kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.388627 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.394677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.394820 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.395640 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.412199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnfpd\" (UniqueName: \"kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd\") pod \"keystone-cron-29493121-7fxmq\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.476591 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.931770 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493121-7fxmq"] Jan 28 08:01:00 crc kubenswrapper[4776]: I0128 08:01:00.973706 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493121-7fxmq" event={"ID":"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d","Type":"ContainerStarted","Data":"ac14bce6fb48535d0872235e9b7c75f2ee5313db4a26f762a15033baced99abd"} Jan 28 08:01:02 crc kubenswrapper[4776]: I0128 08:01:02.001575 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493121-7fxmq" event={"ID":"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d","Type":"ContainerStarted","Data":"1e5758a561f7530c1f5516da1a82687165019d098fe9086f9b7cbccaf976946e"} Jan 28 08:01:02 crc kubenswrapper[4776]: I0128 08:01:02.022610 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29493121-7fxmq" podStartSLOduration=2.02259356 podStartE2EDuration="2.02259356s" podCreationTimestamp="2026-01-28 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 08:01:02.019704152 +0000 UTC m=+4233.435364312" watchObservedRunningTime="2026-01-28 08:01:02.02259356 +0000 UTC m=+4233.438253720" Jan 28 08:01:04 crc kubenswrapper[4776]: I0128 08:01:04.025516 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" containerID="1e5758a561f7530c1f5516da1a82687165019d098fe9086f9b7cbccaf976946e" exitCode=0 Jan 28 08:01:04 crc kubenswrapper[4776]: I0128 08:01:04.025604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493121-7fxmq" event={"ID":"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d","Type":"ContainerDied","Data":"1e5758a561f7530c1f5516da1a82687165019d098fe9086f9b7cbccaf976946e"} 
Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.495627 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.603815 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data\") pod \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.603971 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys\") pod \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.604036 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnfpd\" (UniqueName: \"kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd\") pod \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.604127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle\") pod \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\" (UID: \"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d\") " Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.609627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd" (OuterVolumeSpecName: "kube-api-access-dnfpd") pod "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" (UID: "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d"). 
InnerVolumeSpecName "kube-api-access-dnfpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.609846 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" (UID: "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.634976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" (UID: "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.667507 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data" (OuterVolumeSpecName: "config-data") pod "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" (UID: "bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.706529 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.706579 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.706589 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnfpd\" (UniqueName: \"kubernetes.io/projected/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-kube-api-access-dnfpd\") on node \"crc\" DevicePath \"\"" Jan 28 08:01:05 crc kubenswrapper[4776]: I0128 08:01:05.706600 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 08:01:06 crc kubenswrapper[4776]: I0128 08:01:06.050494 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493121-7fxmq" event={"ID":"bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d","Type":"ContainerDied","Data":"ac14bce6fb48535d0872235e9b7c75f2ee5313db4a26f762a15033baced99abd"} Jan 28 08:01:06 crc kubenswrapper[4776]: I0128 08:01:06.050775 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac14bce6fb48535d0872235e9b7c75f2ee5313db4a26f762a15033baced99abd" Jan 28 08:01:06 crc kubenswrapper[4776]: I0128 08:01:06.050571 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493121-7fxmq" Jan 28 08:02:03 crc kubenswrapper[4776]: I0128 08:02:03.852238 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:02:03 crc kubenswrapper[4776]: I0128 08:02:03.852959 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:02:33 crc kubenswrapper[4776]: I0128 08:02:33.852852 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:02:33 crc kubenswrapper[4776]: I0128 08:02:33.853760 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:03:03 crc kubenswrapper[4776]: I0128 08:03:03.852144 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:03:03 crc kubenswrapper[4776]: I0128 08:03:03.852879 4776 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:03:03 crc kubenswrapper[4776]: I0128 08:03:03.852937 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:03:03 crc kubenswrapper[4776]: I0128 08:03:03.853912 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:03:03 crc kubenswrapper[4776]: I0128 08:03:03.853973 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac" gracePeriod=600 Jan 28 08:03:04 crc kubenswrapper[4776]: I0128 08:03:04.253780 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac" exitCode=0 Jan 28 08:03:04 crc kubenswrapper[4776]: I0128 08:03:04.254129 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac"} Jan 28 08:03:04 crc kubenswrapper[4776]: I0128 
08:03:04.254168 4776 scope.go:117] "RemoveContainer" containerID="6974404edd7d77296d078a3acf4ca7bf48a5d7f83972a6e4e58c7f3b06838a12" Jan 28 08:03:05 crc kubenswrapper[4776]: I0128 08:03:05.266023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a"} Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.025258 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:34 crc kubenswrapper[4776]: E0128 08:03:34.027461 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" containerName="keystone-cron" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.027600 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" containerName="keystone-cron" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.027987 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d" containerName="keystone-cron" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.029918 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.048778 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.160049 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.160167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpw6\" (UniqueName: \"kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.160747 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.263109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.263651 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9gpw6\" (UniqueName: \"kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.263700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.264235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.264703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.286499 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpw6\" (UniqueName: \"kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6\") pod \"redhat-marketplace-dlbzb\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.355925 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:34 crc kubenswrapper[4776]: I0128 08:03:34.869874 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:35 crc kubenswrapper[4776]: I0128 08:03:35.577317 4776 generic.go:334] "Generic (PLEG): container finished" podID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerID="53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6" exitCode=0 Jan 28 08:03:35 crc kubenswrapper[4776]: I0128 08:03:35.577621 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerDied","Data":"53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6"} Jan 28 08:03:35 crc kubenswrapper[4776]: I0128 08:03:35.577647 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerStarted","Data":"c523352e24a4307c838bcd346b1d0a822a3de7896ba10a65e163bef13cddf67a"} Jan 28 08:03:35 crc kubenswrapper[4776]: I0128 08:03:35.580191 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 08:03:36 crc kubenswrapper[4776]: I0128 08:03:36.591209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerStarted","Data":"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9"} Jan 28 08:03:37 crc kubenswrapper[4776]: I0128 08:03:37.609235 4776 generic.go:334] "Generic (PLEG): container finished" podID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerID="27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9" exitCode=0 Jan 28 08:03:37 crc kubenswrapper[4776]: I0128 08:03:37.610378 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerDied","Data":"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9"} Jan 28 08:03:38 crc kubenswrapper[4776]: I0128 08:03:38.629494 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerStarted","Data":"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f"} Jan 28 08:03:38 crc kubenswrapper[4776]: I0128 08:03:38.672744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dlbzb" podStartSLOduration=3.238975563 podStartE2EDuration="5.672723696s" podCreationTimestamp="2026-01-28 08:03:33 +0000 UTC" firstStartedPulling="2026-01-28 08:03:35.579738749 +0000 UTC m=+4386.995398949" lastFinishedPulling="2026-01-28 08:03:38.013486922 +0000 UTC m=+4389.429147082" observedRunningTime="2026-01-28 08:03:38.656437063 +0000 UTC m=+4390.072097253" watchObservedRunningTime="2026-01-28 08:03:38.672723696 +0000 UTC m=+4390.088383866" Jan 28 08:03:44 crc kubenswrapper[4776]: I0128 08:03:44.356760 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:44 crc kubenswrapper[4776]: I0128 08:03:44.357331 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:44 crc kubenswrapper[4776]: I0128 08:03:44.419661 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:44 crc kubenswrapper[4776]: I0128 08:03:44.754961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:44 crc kubenswrapper[4776]: I0128 08:03:44.832748 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:46 crc kubenswrapper[4776]: I0128 08:03:46.714295 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dlbzb" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="registry-server" containerID="cri-o://2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f" gracePeriod=2 Jan 28 08:03:47 crc kubenswrapper[4776]: E0128 08:03:47.029959 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312f8070_69fa_4ed5_9a9d_31fa78db4969.slice/crio-2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312f8070_69fa_4ed5_9a9d_31fa78db4969.slice/crio-conmon-2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f.scope\": RecentStats: unable to find data in memory cache]" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.265478 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.377069 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content\") pod \"312f8070-69fa-4ed5-9a9d-31fa78db4969\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.377127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gpw6\" (UniqueName: \"kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6\") pod \"312f8070-69fa-4ed5-9a9d-31fa78db4969\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.377163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities\") pod \"312f8070-69fa-4ed5-9a9d-31fa78db4969\" (UID: \"312f8070-69fa-4ed5-9a9d-31fa78db4969\") " Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.381225 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities" (OuterVolumeSpecName: "utilities") pod "312f8070-69fa-4ed5-9a9d-31fa78db4969" (UID: "312f8070-69fa-4ed5-9a9d-31fa78db4969"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.385893 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6" (OuterVolumeSpecName: "kube-api-access-9gpw6") pod "312f8070-69fa-4ed5-9a9d-31fa78db4969" (UID: "312f8070-69fa-4ed5-9a9d-31fa78db4969"). InnerVolumeSpecName "kube-api-access-9gpw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.405467 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "312f8070-69fa-4ed5-9a9d-31fa78db4969" (UID: "312f8070-69fa-4ed5-9a9d-31fa78db4969"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.480421 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.480454 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gpw6\" (UniqueName: \"kubernetes.io/projected/312f8070-69fa-4ed5-9a9d-31fa78db4969-kube-api-access-9gpw6\") on node \"crc\" DevicePath \"\"" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.480468 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312f8070-69fa-4ed5-9a9d-31fa78db4969-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.729860 4776 generic.go:334] "Generic (PLEG): container finished" podID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerID="2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f" exitCode=0 Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.729908 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerDied","Data":"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f"} Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.729934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dlbzb" event={"ID":"312f8070-69fa-4ed5-9a9d-31fa78db4969","Type":"ContainerDied","Data":"c523352e24a4307c838bcd346b1d0a822a3de7896ba10a65e163bef13cddf67a"} Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.729952 4776 scope.go:117] "RemoveContainer" containerID="2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.730079 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlbzb" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.784085 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.788677 4776 scope.go:117] "RemoveContainer" containerID="27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.794929 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlbzb"] Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.812792 4776 scope.go:117] "RemoveContainer" containerID="53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.872073 4776 scope.go:117] "RemoveContainer" containerID="2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f" Jan 28 08:03:47 crc kubenswrapper[4776]: E0128 08:03:47.872696 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f\": container with ID starting with 2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f not found: ID does not exist" containerID="2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.872743 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f"} err="failed to get container status \"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f\": rpc error: code = NotFound desc = could not find container \"2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f\": container with ID starting with 2564950837e6c6e0485b322b46dcf81bbb767eec54b8527d8314e2582e5ca20f not found: ID does not exist" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.872769 4776 scope.go:117] "RemoveContainer" containerID="27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9" Jan 28 08:03:47 crc kubenswrapper[4776]: E0128 08:03:47.873257 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9\": container with ID starting with 27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9 not found: ID does not exist" containerID="27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.873297 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9"} err="failed to get container status \"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9\": rpc error: code = NotFound desc = could not find container \"27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9\": container with ID starting with 27a9b0ab1e01b34a4f8474b28f1a257a2486eebfe5e203b35b2d4007826c62c9 not found: ID does not exist" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.873323 4776 scope.go:117] "RemoveContainer" containerID="53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6" Jan 28 08:03:47 crc kubenswrapper[4776]: E0128 
08:03:47.873702 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6\": container with ID starting with 53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6 not found: ID does not exist" containerID="53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6" Jan 28 08:03:47 crc kubenswrapper[4776]: I0128 08:03:47.873755 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6"} err="failed to get container status \"53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6\": rpc error: code = NotFound desc = could not find container \"53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6\": container with ID starting with 53376ce57de41d5b750f20dfb191258904e45e454f044ce00a017b9938cec1b6 not found: ID does not exist" Jan 28 08:03:49 crc kubenswrapper[4776]: I0128 08:03:49.318501 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" path="/var/lib/kubelet/pods/312f8070-69fa-4ed5-9a9d-31fa78db4969/volumes" Jan 28 08:05:33 crc kubenswrapper[4776]: I0128 08:05:33.853004 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:05:33 crc kubenswrapper[4776]: I0128 08:05:33.853653 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 28 08:06:03 crc kubenswrapper[4776]: I0128 08:06:03.852401 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:06:03 crc kubenswrapper[4776]: I0128 08:06:03.852949 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:06:33 crc kubenswrapper[4776]: I0128 08:06:33.852691 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:06:33 crc kubenswrapper[4776]: I0128 08:06:33.853415 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:06:33 crc kubenswrapper[4776]: I0128 08:06:33.853467 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:06:33 crc kubenswrapper[4776]: I0128 08:06:33.854308 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a"} 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:06:33 crc kubenswrapper[4776]: I0128 08:06:33.854377 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" gracePeriod=600 Jan 28 08:06:33 crc kubenswrapper[4776]: E0128 08:06:33.980150 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:06:34 crc kubenswrapper[4776]: I0128 08:06:34.402174 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" exitCode=0 Jan 28 08:06:34 crc kubenswrapper[4776]: I0128 08:06:34.402275 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a"} Jan 28 08:06:34 crc kubenswrapper[4776]: I0128 08:06:34.402616 4776 scope.go:117] "RemoveContainer" containerID="73ee3a896c8189ebaffaf431eaef6bac020496d7e43e90b69160f650f60dceac" Jan 28 08:06:34 crc kubenswrapper[4776]: I0128 08:06:34.403182 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 
28 08:06:34 crc kubenswrapper[4776]: E0128 08:06:34.403491 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:06:45 crc kubenswrapper[4776]: I0128 08:06:45.304329 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:06:45 crc kubenswrapper[4776]: E0128 08:06:45.305458 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:06:58 crc kubenswrapper[4776]: I0128 08:06:58.304856 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:06:58 crc kubenswrapper[4776]: E0128 08:06:58.306337 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:07:11 crc kubenswrapper[4776]: I0128 08:07:11.304691 4776 scope.go:117] "RemoveContainer" 
containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:07:11 crc kubenswrapper[4776]: E0128 08:07:11.305758 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:07:24 crc kubenswrapper[4776]: I0128 08:07:24.304596 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:07:24 crc kubenswrapper[4776]: E0128 08:07:24.305476 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:07:39 crc kubenswrapper[4776]: I0128 08:07:39.315178 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:07:39 crc kubenswrapper[4776]: E0128 08:07:39.316505 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:07:54 crc kubenswrapper[4776]: I0128 08:07:54.304438 4776 scope.go:117] 
"RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:07:54 crc kubenswrapper[4776]: E0128 08:07:54.305382 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:08:06 crc kubenswrapper[4776]: I0128 08:08:06.304869 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:08:06 crc kubenswrapper[4776]: E0128 08:08:06.305683 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:08:17 crc kubenswrapper[4776]: I0128 08:08:17.306012 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:08:17 crc kubenswrapper[4776]: E0128 08:08:17.306897 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.138271 
4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:21 crc kubenswrapper[4776]: E0128 08:08:21.139140 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="registry-server" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.139156 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="registry-server" Jan 28 08:08:21 crc kubenswrapper[4776]: E0128 08:08:21.139190 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="extract-utilities" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.139211 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="extract-utilities" Jan 28 08:08:21 crc kubenswrapper[4776]: E0128 08:08:21.139238 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="extract-content" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.139246 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="extract-content" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.139504 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="312f8070-69fa-4ed5-9a9d-31fa78db4969" containerName="registry-server" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.141259 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.162230 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.316253 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcszq\" (UniqueName: \"kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.316425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.317116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.418469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcszq\" (UniqueName: \"kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.418614 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.418701 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.419297 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.419469 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.439091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcszq\" (UniqueName: \"kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq\") pod \"community-operators-gpgbv\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:21 crc kubenswrapper[4776]: I0128 08:08:21.527705 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:22 crc kubenswrapper[4776]: I0128 08:08:22.072392 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:22 crc kubenswrapper[4776]: I0128 08:08:22.582844 4776 generic.go:334] "Generic (PLEG): container finished" podID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerID="8fbc443b6bbd8698e8fd512fa6426c675812f77a932309bd86d8e04e166ff901" exitCode=0 Jan 28 08:08:22 crc kubenswrapper[4776]: I0128 08:08:22.582941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerDied","Data":"8fbc443b6bbd8698e8fd512fa6426c675812f77a932309bd86d8e04e166ff901"} Jan 28 08:08:22 crc kubenswrapper[4776]: I0128 08:08:22.583272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerStarted","Data":"9f35e8b6f2840d2eedd42a10848f63e96f75e41130ffcd2470b16988152603bd"} Jan 28 08:08:23 crc kubenswrapper[4776]: I0128 08:08:23.598379 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerStarted","Data":"1aa002f27322539ca8fda114176f37d7689203df055627b45a2966ebcc1830e8"} Jan 28 08:08:24 crc kubenswrapper[4776]: I0128 08:08:24.613839 4776 generic.go:334] "Generic (PLEG): container finished" podID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerID="1aa002f27322539ca8fda114176f37d7689203df055627b45a2966ebcc1830e8" exitCode=0 Jan 28 08:08:24 crc kubenswrapper[4776]: I0128 08:08:24.614052 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" 
event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerDied","Data":"1aa002f27322539ca8fda114176f37d7689203df055627b45a2966ebcc1830e8"} Jan 28 08:08:25 crc kubenswrapper[4776]: I0128 08:08:25.624627 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerStarted","Data":"43823e794811b246250b0e3d4bf13c4ab11f91122d43f63079c46477e3e25b1c"} Jan 28 08:08:25 crc kubenswrapper[4776]: I0128 08:08:25.662228 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gpgbv" podStartSLOduration=2.235407971 podStartE2EDuration="4.662209868s" podCreationTimestamp="2026-01-28 08:08:21 +0000 UTC" firstStartedPulling="2026-01-28 08:08:22.58502896 +0000 UTC m=+4674.000689130" lastFinishedPulling="2026-01-28 08:08:25.011830867 +0000 UTC m=+4676.427491027" observedRunningTime="2026-01-28 08:08:25.644772315 +0000 UTC m=+4677.060432475" watchObservedRunningTime="2026-01-28 08:08:25.662209868 +0000 UTC m=+4677.077870028" Jan 28 08:08:29 crc kubenswrapper[4776]: I0128 08:08:29.314320 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:08:29 crc kubenswrapper[4776]: E0128 08:08:29.315418 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:08:31 crc kubenswrapper[4776]: I0128 08:08:31.528124 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:31 crc 
kubenswrapper[4776]: I0128 08:08:31.529345 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:31 crc kubenswrapper[4776]: I0128 08:08:31.599960 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:31 crc kubenswrapper[4776]: I0128 08:08:31.767108 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:31 crc kubenswrapper[4776]: I0128 08:08:31.851366 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:33 crc kubenswrapper[4776]: I0128 08:08:33.719246 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gpgbv" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="registry-server" containerID="cri-o://43823e794811b246250b0e3d4bf13c4ab11f91122d43f63079c46477e3e25b1c" gracePeriod=2 Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.729378 4776 generic.go:334] "Generic (PLEG): container finished" podID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerID="43823e794811b246250b0e3d4bf13c4ab11f91122d43f63079c46477e3e25b1c" exitCode=0 Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.729437 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerDied","Data":"43823e794811b246250b0e3d4bf13c4ab11f91122d43f63079c46477e3e25b1c"} Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.825318 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.983517 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities\") pod \"2035a23a-3880-4f7c-aaf9-993cf51766ed\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.983681 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content\") pod \"2035a23a-3880-4f7c-aaf9-993cf51766ed\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.983750 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcszq\" (UniqueName: \"kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq\") pod \"2035a23a-3880-4f7c-aaf9-993cf51766ed\" (UID: \"2035a23a-3880-4f7c-aaf9-993cf51766ed\") " Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.984950 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities" (OuterVolumeSpecName: "utilities") pod "2035a23a-3880-4f7c-aaf9-993cf51766ed" (UID: "2035a23a-3880-4f7c-aaf9-993cf51766ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:08:34 crc kubenswrapper[4776]: I0128 08:08:34.991379 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq" (OuterVolumeSpecName: "kube-api-access-vcszq") pod "2035a23a-3880-4f7c-aaf9-993cf51766ed" (UID: "2035a23a-3880-4f7c-aaf9-993cf51766ed"). InnerVolumeSpecName "kube-api-access-vcszq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.057201 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2035a23a-3880-4f7c-aaf9-993cf51766ed" (UID: "2035a23a-3880-4f7c-aaf9-993cf51766ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.085425 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.085461 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcszq\" (UniqueName: \"kubernetes.io/projected/2035a23a-3880-4f7c-aaf9-993cf51766ed-kube-api-access-vcszq\") on node \"crc\" DevicePath \"\"" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.085472 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2035a23a-3880-4f7c-aaf9-993cf51766ed-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.743156 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgbv" event={"ID":"2035a23a-3880-4f7c-aaf9-993cf51766ed","Type":"ContainerDied","Data":"9f35e8b6f2840d2eedd42a10848f63e96f75e41130ffcd2470b16988152603bd"} Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.743237 4776 scope.go:117] "RemoveContainer" containerID="43823e794811b246250b0e3d4bf13c4ab11f91122d43f63079c46477e3e25b1c" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.743180 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gpgbv" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.774743 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.783836 4776 scope.go:117] "RemoveContainer" containerID="1aa002f27322539ca8fda114176f37d7689203df055627b45a2966ebcc1830e8" Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.786234 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gpgbv"] Jan 28 08:08:35 crc kubenswrapper[4776]: I0128 08:08:35.811503 4776 scope.go:117] "RemoveContainer" containerID="8fbc443b6bbd8698e8fd512fa6426c675812f77a932309bd86d8e04e166ff901" Jan 28 08:08:37 crc kubenswrapper[4776]: I0128 08:08:37.324869 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" path="/var/lib/kubelet/pods/2035a23a-3880-4f7c-aaf9-993cf51766ed/volumes" Jan 28 08:08:40 crc kubenswrapper[4776]: I0128 08:08:40.305637 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:08:40 crc kubenswrapper[4776]: E0128 08:08:40.306248 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:08:52 crc kubenswrapper[4776]: I0128 08:08:52.306441 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:08:52 crc kubenswrapper[4776]: E0128 08:08:52.307963 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:09:07 crc kubenswrapper[4776]: I0128 08:09:07.305984 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:09:07 crc kubenswrapper[4776]: E0128 08:09:07.306897 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.353710 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bfs7z"] Jan 28 08:09:09 crc kubenswrapper[4776]: E0128 08:09:09.356182 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="extract-utilities" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.356214 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="extract-utilities" Jan 28 08:09:09 crc kubenswrapper[4776]: E0128 08:09:09.356254 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="extract-content" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.356267 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" 
containerName="extract-content" Jan 28 08:09:09 crc kubenswrapper[4776]: E0128 08:09:09.356301 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="registry-server" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.356312 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="registry-server" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.356732 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2035a23a-3880-4f7c-aaf9-993cf51766ed" containerName="registry-server" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.360177 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.363708 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bfs7z"] Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.513167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-utilities\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.513345 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-catalog-content\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.513402 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4mdt7\" (UniqueName: \"kubernetes.io/projected/d6bbc905-926a-485a-a0da-4ac35f39505b-kube-api-access-4mdt7\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.615235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdt7\" (UniqueName: \"kubernetes.io/projected/d6bbc905-926a-485a-a0da-4ac35f39505b-kube-api-access-4mdt7\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.615323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-utilities\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.615470 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-catalog-content\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.615877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-utilities\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.615996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d6bbc905-926a-485a-a0da-4ac35f39505b-catalog-content\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.636462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdt7\" (UniqueName: \"kubernetes.io/projected/d6bbc905-926a-485a-a0da-4ac35f39505b-kube-api-access-4mdt7\") pod \"certified-operators-bfs7z\" (UID: \"d6bbc905-926a-485a-a0da-4ac35f39505b\") " pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:09 crc kubenswrapper[4776]: I0128 08:09:09.687836 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:10 crc kubenswrapper[4776]: I0128 08:09:10.259799 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bfs7z"] Jan 28 08:09:11 crc kubenswrapper[4776]: I0128 08:09:11.142470 4776 generic.go:334] "Generic (PLEG): container finished" podID="d6bbc905-926a-485a-a0da-4ac35f39505b" containerID="b8ff56eeb2166618f8543a866c518c993883e788e989c76ea22b9b6c7af815f9" exitCode=0 Jan 28 08:09:11 crc kubenswrapper[4776]: I0128 08:09:11.142530 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfs7z" event={"ID":"d6bbc905-926a-485a-a0da-4ac35f39505b","Type":"ContainerDied","Data":"b8ff56eeb2166618f8543a866c518c993883e788e989c76ea22b9b6c7af815f9"} Jan 28 08:09:11 crc kubenswrapper[4776]: I0128 08:09:11.142589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfs7z" event={"ID":"d6bbc905-926a-485a-a0da-4ac35f39505b","Type":"ContainerStarted","Data":"f88e7d47bdea120b18fe23c55862b706d3cbf88271956d40250245218e51e6aa"} Jan 28 08:09:11 crc kubenswrapper[4776]: I0128 08:09:11.144276 4776 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 08:09:16 crc kubenswrapper[4776]: I0128 08:09:16.184899 4776 generic.go:334] "Generic (PLEG): container finished" podID="d6bbc905-926a-485a-a0da-4ac35f39505b" containerID="205affda8b0b67c01626028634b3c39765f2bca14c8c0da78a7bf5d91b5f5886" exitCode=0 Jan 28 08:09:16 crc kubenswrapper[4776]: I0128 08:09:16.184986 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfs7z" event={"ID":"d6bbc905-926a-485a-a0da-4ac35f39505b","Type":"ContainerDied","Data":"205affda8b0b67c01626028634b3c39765f2bca14c8c0da78a7bf5d91b5f5886"} Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.220936 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bfs7z" podStartSLOduration=2.796361118 podStartE2EDuration="8.22091932s" podCreationTimestamp="2026-01-28 08:09:09 +0000 UTC" firstStartedPulling="2026-01-28 08:09:11.144011544 +0000 UTC m=+4722.559671704" lastFinishedPulling="2026-01-28 08:09:16.568569746 +0000 UTC m=+4727.984229906" observedRunningTime="2026-01-28 08:09:17.215686638 +0000 UTC m=+4728.631346798" watchObservedRunningTime="2026-01-28 08:09:17.22091932 +0000 UTC m=+4728.636579480" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.748798 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.752802 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.771565 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.875388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.875452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27pf\" (UniqueName: \"kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.875617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.976915 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.977245 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.977283 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27pf\" (UniqueName: \"kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.978160 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.978421 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:17 crc kubenswrapper[4776]: I0128 08:09:17.998947 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27pf\" (UniqueName: \"kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf\") pod \"redhat-operators-kxd8w\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:18 crc kubenswrapper[4776]: I0128 08:09:18.077599 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:18 crc kubenswrapper[4776]: I0128 08:09:18.230642 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bfs7z" event={"ID":"d6bbc905-926a-485a-a0da-4ac35f39505b","Type":"ContainerStarted","Data":"050ce377d652ce94b68ad816d506a71fd5770fa65c50bcc24665687a94371105"} Jan 28 08:09:18 crc kubenswrapper[4776]: I0128 08:09:18.541083 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:19 crc kubenswrapper[4776]: I0128 08:09:19.242649 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerStarted","Data":"2c80b4831fdaf1b0e845a65376c2f56e2f411086957c2e777b96e29654918f53"} Jan 28 08:09:19 crc kubenswrapper[4776]: I0128 08:09:19.314716 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:09:19 crc kubenswrapper[4776]: E0128 08:09:19.315083 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:09:19 crc kubenswrapper[4776]: I0128 08:09:19.688727 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:19 crc kubenswrapper[4776]: I0128 08:09:19.688796 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:19 crc kubenswrapper[4776]: I0128 
08:09:19.738569 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:20 crc kubenswrapper[4776]: I0128 08:09:20.257123 4776 generic.go:334] "Generic (PLEG): container finished" podID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerID="2b6e8cd07b6978e1a4683a527dd094e7911dffbdfe1fb91fd7c30ae567e8ec91" exitCode=0 Jan 28 08:09:20 crc kubenswrapper[4776]: I0128 08:09:20.259943 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerDied","Data":"2b6e8cd07b6978e1a4683a527dd094e7911dffbdfe1fb91fd7c30ae567e8ec91"} Jan 28 08:09:22 crc kubenswrapper[4776]: I0128 08:09:22.277130 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerStarted","Data":"fdbdf2ab63576e4c2b612bc0d0e975bd8e05cd271db954cf1f89546daf68f901"} Jan 28 08:09:23 crc kubenswrapper[4776]: I0128 08:09:23.289278 4776 generic.go:334] "Generic (PLEG): container finished" podID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerID="fdbdf2ab63576e4c2b612bc0d0e975bd8e05cd271db954cf1f89546daf68f901" exitCode=0 Jan 28 08:09:23 crc kubenswrapper[4776]: I0128 08:09:23.289368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerDied","Data":"fdbdf2ab63576e4c2b612bc0d0e975bd8e05cd271db954cf1f89546daf68f901"} Jan 28 08:09:24 crc kubenswrapper[4776]: I0128 08:09:24.303178 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerStarted","Data":"759d25dd9a82dca8e4433b7cb006e87afc51652aa19f3e78d776dbf3f15d1097"} Jan 28 08:09:24 crc kubenswrapper[4776]: I0128 08:09:24.330903 
4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxd8w" podStartSLOduration=3.577475156 podStartE2EDuration="7.330882035s" podCreationTimestamp="2026-01-28 08:09:17 +0000 UTC" firstStartedPulling="2026-01-28 08:09:20.261697562 +0000 UTC m=+4731.677357732" lastFinishedPulling="2026-01-28 08:09:24.015104461 +0000 UTC m=+4735.430764611" observedRunningTime="2026-01-28 08:09:24.325248342 +0000 UTC m=+4735.740908502" watchObservedRunningTime="2026-01-28 08:09:24.330882035 +0000 UTC m=+4735.746542195" Jan 28 08:09:28 crc kubenswrapper[4776]: I0128 08:09:28.078089 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:28 crc kubenswrapper[4776]: I0128 08:09:28.078884 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:29 crc kubenswrapper[4776]: I0128 08:09:29.737928 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bfs7z" Jan 28 08:09:29 crc kubenswrapper[4776]: I0128 08:09:29.809430 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bfs7z"] Jan 28 08:09:29 crc kubenswrapper[4776]: I0128 08:09:29.860297 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 08:09:29 crc kubenswrapper[4776]: I0128 08:09:29.860573 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfr2v" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="registry-server" containerID="cri-o://81b0d802a83ac4cfccc10871f6f7ef1473e1427b30b06e17bc8fc73906f1e636" gracePeriod=2 Jan 28 08:09:29 crc kubenswrapper[4776]: I0128 08:09:29.987846 4776 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-kxd8w" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="registry-server" probeResult="failure" output=< Jan 28 08:09:29 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 08:09:29 crc kubenswrapper[4776]: > Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.305604 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:09:30 crc kubenswrapper[4776]: E0128 08:09:30.306310 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.361879 4776 generic.go:334] "Generic (PLEG): container finished" podID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerID="81b0d802a83ac4cfccc10871f6f7ef1473e1427b30b06e17bc8fc73906f1e636" exitCode=0 Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.362905 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerDied","Data":"81b0d802a83ac4cfccc10871f6f7ef1473e1427b30b06e17bc8fc73906f1e636"} Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.884640 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.989502 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content\") pod \"fbe549a5-e8f7-4868-9420-f64ff851880e\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.989671 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqb27\" (UniqueName: \"kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27\") pod \"fbe549a5-e8f7-4868-9420-f64ff851880e\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.989706 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities\") pod \"fbe549a5-e8f7-4868-9420-f64ff851880e\" (UID: \"fbe549a5-e8f7-4868-9420-f64ff851880e\") " Jan 28 08:09:30 crc kubenswrapper[4776]: I0128 08:09:30.990753 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities" (OuterVolumeSpecName: "utilities") pod "fbe549a5-e8f7-4868-9420-f64ff851880e" (UID: "fbe549a5-e8f7-4868-9420-f64ff851880e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.015734 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27" (OuterVolumeSpecName: "kube-api-access-gqb27") pod "fbe549a5-e8f7-4868-9420-f64ff851880e" (UID: "fbe549a5-e8f7-4868-9420-f64ff851880e"). InnerVolumeSpecName "kube-api-access-gqb27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.042635 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe549a5-e8f7-4868-9420-f64ff851880e" (UID: "fbe549a5-e8f7-4868-9420-f64ff851880e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.092191 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqb27\" (UniqueName: \"kubernetes.io/projected/fbe549a5-e8f7-4868-9420-f64ff851880e-kube-api-access-gqb27\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.092511 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.092520 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe549a5-e8f7-4868-9420-f64ff851880e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.375006 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfr2v" event={"ID":"fbe549a5-e8f7-4868-9420-f64ff851880e","Type":"ContainerDied","Data":"8ff9ed36f35f5355cedce14b0a7e92329409441de3e74aa1c2fc7bd4f27a0dcb"} Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.375052 4776 scope.go:117] "RemoveContainer" containerID="81b0d802a83ac4cfccc10871f6f7ef1473e1427b30b06e17bc8fc73906f1e636" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.375159 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfr2v" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.400294 4776 scope.go:117] "RemoveContainer" containerID="f4d509684a872610b3c457e40c9c0f5f25e0b2fdca819c50ea3b8c31b51b2ad8" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.416591 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.420021 4776 scope.go:117] "RemoveContainer" containerID="6e612e5cfd3ce240aae5fe861d6e696bf4a1c284c305b831cb64c6a6526bcf5b" Jan 28 08:09:31 crc kubenswrapper[4776]: I0128 08:09:31.427726 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfr2v"] Jan 28 08:09:33 crc kubenswrapper[4776]: I0128 08:09:33.324685 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" path="/var/lib/kubelet/pods/fbe549a5-e8f7-4868-9420-f64ff851880e/volumes" Jan 28 08:09:38 crc kubenswrapper[4776]: I0128 08:09:38.160881 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:38 crc kubenswrapper[4776]: I0128 08:09:38.227520 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:39 crc kubenswrapper[4776]: I0128 08:09:39.405005 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:39 crc kubenswrapper[4776]: I0128 08:09:39.460351 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxd8w" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="registry-server" containerID="cri-o://759d25dd9a82dca8e4433b7cb006e87afc51652aa19f3e78d776dbf3f15d1097" gracePeriod=2 Jan 28 08:09:40 crc kubenswrapper[4776]: I0128 
08:09:40.476425 4776 generic.go:334] "Generic (PLEG): container finished" podID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerID="759d25dd9a82dca8e4433b7cb006e87afc51652aa19f3e78d776dbf3f15d1097" exitCode=0 Jan 28 08:09:40 crc kubenswrapper[4776]: I0128 08:09:40.477355 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerDied","Data":"759d25dd9a82dca8e4433b7cb006e87afc51652aa19f3e78d776dbf3f15d1097"} Jan 28 08:09:40 crc kubenswrapper[4776]: I0128 08:09:40.919254 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.010314 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities\") pod \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.010577 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content\") pod \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.010680 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h27pf\" (UniqueName: \"kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf\") pod \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\" (UID: \"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c\") " Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.011784 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities" (OuterVolumeSpecName: "utilities") pod "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" (UID: "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.018394 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf" (OuterVolumeSpecName: "kube-api-access-h27pf") pod "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" (UID: "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c"). InnerVolumeSpecName "kube-api-access-h27pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.113845 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h27pf\" (UniqueName: \"kubernetes.io/projected/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-kube-api-access-h27pf\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.113892 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.155642 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" (UID: "a4cb8eec-2cbe-483b-96db-e52b4cf8d92c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.215171 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.488358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxd8w" event={"ID":"a4cb8eec-2cbe-483b-96db-e52b4cf8d92c","Type":"ContainerDied","Data":"2c80b4831fdaf1b0e845a65376c2f56e2f411086957c2e777b96e29654918f53"} Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.488793 4776 scope.go:117] "RemoveContainer" containerID="759d25dd9a82dca8e4433b7cb006e87afc51652aa19f3e78d776dbf3f15d1097" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.488434 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxd8w" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.531104 4776 scope.go:117] "RemoveContainer" containerID="fdbdf2ab63576e4c2b612bc0d0e975bd8e05cd271db954cf1f89546daf68f901" Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.532391 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.545278 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxd8w"] Jan 28 08:09:41 crc kubenswrapper[4776]: I0128 08:09:41.554399 4776 scope.go:117] "RemoveContainer" containerID="2b6e8cd07b6978e1a4683a527dd094e7911dffbdfe1fb91fd7c30ae567e8ec91" Jan 28 08:09:42 crc kubenswrapper[4776]: I0128 08:09:42.304734 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:09:42 crc kubenswrapper[4776]: E0128 08:09:42.305309 4776 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:09:43 crc kubenswrapper[4776]: I0128 08:09:43.319454 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" path="/var/lib/kubelet/pods/a4cb8eec-2cbe-483b-96db-e52b4cf8d92c/volumes" Jan 28 08:09:53 crc kubenswrapper[4776]: I0128 08:09:53.304663 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:09:53 crc kubenswrapper[4776]: E0128 08:09:53.305486 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:10:04 crc kubenswrapper[4776]: I0128 08:10:04.304786 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:10:04 crc kubenswrapper[4776]: E0128 08:10:04.306190 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:10:18 
crc kubenswrapper[4776]: I0128 08:10:18.304817 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:10:18 crc kubenswrapper[4776]: E0128 08:10:18.305723 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:10:29 crc kubenswrapper[4776]: I0128 08:10:29.312917 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:10:29 crc kubenswrapper[4776]: E0128 08:10:29.315510 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:10:42 crc kubenswrapper[4776]: I0128 08:10:42.304862 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:10:42 crc kubenswrapper[4776]: E0128 08:10:42.305825 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" 
Jan 28 08:10:56 crc kubenswrapper[4776]: I0128 08:10:56.305331 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:10:56 crc kubenswrapper[4776]: E0128 08:10:56.306655 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:11:03 crc kubenswrapper[4776]: I0128 08:11:03.371414 4776 generic.go:334] "Generic (PLEG): container finished" podID="0605b294-d429-4bfd-8924-39f8cb5cb105" containerID="7c732218fd60783b0babc90bbb805fb43a92393873f5febb8b1b0b5dbf501a07" exitCode=0 Jan 28 08:11:03 crc kubenswrapper[4776]: I0128 08:11:03.371497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0605b294-d429-4bfd-8924-39f8cb5cb105","Type":"ContainerDied","Data":"7c732218fd60783b0babc90bbb805fb43a92393873f5febb8b1b0b5dbf501a07"} Jan 28 08:11:04 crc kubenswrapper[4776]: I0128 08:11:04.824988 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.020976 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021039 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021077 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvkfj\" (UniqueName: \"kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021129 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021236 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021271 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021342 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.021436 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs\") pod \"0605b294-d429-4bfd-8924-39f8cb5cb105\" (UID: \"0605b294-d429-4bfd-8924-39f8cb5cb105\") " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.022097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.022715 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.022748 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data" (OuterVolumeSpecName: "config-data") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.029417 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.029447 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj" (OuterVolumeSpecName: "kube-api-access-hvkfj") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "kube-api-access-hvkfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.049589 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.052923 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.056700 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.099002 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.107123 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0605b294-d429-4bfd-8924-39f8cb5cb105" (UID: "0605b294-d429-4bfd-8924-39f8cb5cb105"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124772 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124844 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124860 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0605b294-d429-4bfd-8924-39f8cb5cb105-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124877 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124890 4776 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124902 4776 reconciler_common.go:293] 
"Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0605b294-d429-4bfd-8924-39f8cb5cb105-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124912 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvkfj\" (UniqueName: \"kubernetes.io/projected/0605b294-d429-4bfd-8924-39f8cb5cb105-kube-api-access-hvkfj\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.124920 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0605b294-d429-4bfd-8924-39f8cb5cb105-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.162573 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.226651 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.393495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0605b294-d429-4bfd-8924-39f8cb5cb105","Type":"ContainerDied","Data":"49258267d9dcabfa04b7c7676bb04e4b80c75105fca1f4bc9300342f767706bd"} Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.393557 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49258267d9dcabfa04b7c7676bb04e4b80c75105fca1f4bc9300342f767706bd" Jan 28 08:11:05 crc kubenswrapper[4776]: I0128 08:11:05.393565 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.977095 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978157 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="extract-content" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978178 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="extract-content" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978199 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="extract-content" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978207 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="extract-content" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978226 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="extract-utilities" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978234 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="extract-utilities" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978253 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978262 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978273 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0605b294-d429-4bfd-8924-39f8cb5cb105" containerName="tempest-tests-tempest-tests-runner" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978281 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0605b294-d429-4bfd-8924-39f8cb5cb105" containerName="tempest-tests-tempest-tests-runner" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978314 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978322 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: E0128 08:11:07.978346 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="extract-utilities" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978353 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="extract-utilities" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978594 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0605b294-d429-4bfd-8924-39f8cb5cb105" containerName="tempest-tests-tempest-tests-runner" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978618 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cb8eec-2cbe-483b-96db-e52b4cf8d92c" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.978634 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe549a5-e8f7-4868-9420-f64ff851880e" containerName="registry-server" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.979463 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:07 crc kubenswrapper[4776]: I0128 08:11:07.983144 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-86mg2" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.000982 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.095339 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2cws\" (UniqueName: \"kubernetes.io/projected/40947156-8378-437f-935a-da00e0908508-kube-api-access-x2cws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.095599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.197964 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.198142 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2cws\" (UniqueName: 
\"kubernetes.io/projected/40947156-8378-437f-935a-da00e0908508-kube-api-access-x2cws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.198741 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.219253 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2cws\" (UniqueName: \"kubernetes.io/projected/40947156-8378-437f-935a-da00e0908508-kube-api-access-x2cws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.240010 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"40947156-8378-437f-935a-da00e0908508\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.305507 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:11:08 crc kubenswrapper[4776]: E0128 08:11:08.306196 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.312667 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 28 08:11:08 crc kubenswrapper[4776]: I0128 08:11:08.816062 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 28 08:11:09 crc kubenswrapper[4776]: I0128 08:11:09.433475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"40947156-8378-437f-935a-da00e0908508","Type":"ContainerStarted","Data":"6ce649cf85df526f6dfda531805f5f6d8430c09c3a6df9b17b02f9ef69639ec5"} Jan 28 08:11:10 crc kubenswrapper[4776]: I0128 08:11:10.448792 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"40947156-8378-437f-935a-da00e0908508","Type":"ContainerStarted","Data":"5a6db8aac9fc14e90e26017838c3416427adb24c7fb835b2d62a7d0c6b7b4758"} Jan 28 08:11:10 crc kubenswrapper[4776]: I0128 08:11:10.466357 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.344266445 podStartE2EDuration="3.466329873s" podCreationTimestamp="2026-01-28 08:11:07 +0000 UTC" firstStartedPulling="2026-01-28 08:11:08.826732743 +0000 UTC m=+4840.242392943" lastFinishedPulling="2026-01-28 08:11:09.948796201 +0000 UTC m=+4841.364456371" observedRunningTime="2026-01-28 08:11:10.466089976 +0000 UTC m=+4841.881750156" watchObservedRunningTime="2026-01-28 08:11:10.466329873 +0000 UTC m=+4841.881990063" Jan 28 08:11:20 crc 
kubenswrapper[4776]: I0128 08:11:20.304685 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:11:20 crc kubenswrapper[4776]: E0128 08:11:20.305725 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:11:34 crc kubenswrapper[4776]: I0128 08:11:34.305245 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:11:34 crc kubenswrapper[4776]: I0128 08:11:34.711908 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817"} Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.953046 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k85x2/must-gather-9pzlp"] Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.955297 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.959244 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k85x2"/"default-dockercfg-9pzmq" Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.959448 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k85x2"/"openshift-service-ca.crt" Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.959734 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k85x2"/"kube-root-ca.crt" Jan 28 08:11:35 crc kubenswrapper[4776]: I0128 08:11:35.961832 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k85x2/must-gather-9pzlp"] Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.044704 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwdw\" (UniqueName: \"kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.044853 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.146488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " 
pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.146653 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwdw\" (UniqueName: \"kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.147086 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.168150 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwdw\" (UniqueName: \"kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw\") pod \"must-gather-9pzlp\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.311284 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.677788 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k85x2/must-gather-9pzlp"] Jan 28 08:11:36 crc kubenswrapper[4776]: I0128 08:11:36.729091 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/must-gather-9pzlp" event={"ID":"7bba7db5-580d-401d-9808-aab65fe407c1","Type":"ContainerStarted","Data":"d2b4b69d998e6326978455dfee248b4fd5f6d9eddc4b8c4769d0a1cf2cf2587b"} Jan 28 08:11:44 crc kubenswrapper[4776]: I0128 08:11:44.809670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/must-gather-9pzlp" event={"ID":"7bba7db5-580d-401d-9808-aab65fe407c1","Type":"ContainerStarted","Data":"b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7"} Jan 28 08:11:44 crc kubenswrapper[4776]: I0128 08:11:44.810350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/must-gather-9pzlp" event={"ID":"7bba7db5-580d-401d-9808-aab65fe407c1","Type":"ContainerStarted","Data":"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2"} Jan 28 08:11:44 crc kubenswrapper[4776]: I0128 08:11:44.827960 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k85x2/must-gather-9pzlp" podStartSLOduration=2.364172157 podStartE2EDuration="9.827945037s" podCreationTimestamp="2026-01-28 08:11:35 +0000 UTC" firstStartedPulling="2026-01-28 08:11:36.685441969 +0000 UTC m=+4868.101102129" lastFinishedPulling="2026-01-28 08:11:44.149214849 +0000 UTC m=+4875.564875009" observedRunningTime="2026-01-28 08:11:44.826829167 +0000 UTC m=+4876.242489337" watchObservedRunningTime="2026-01-28 08:11:44.827945037 +0000 UTC m=+4876.243605197" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.063767 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-k85x2/crc-debug-wp5ns"] Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.067211 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.208145 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qjcf\" (UniqueName: \"kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.208602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.310080 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qjcf\" (UniqueName: \"kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.310448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.310616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.330764 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qjcf\" (UniqueName: \"kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf\") pod \"crc-debug-wp5ns\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.400259 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:11:48 crc kubenswrapper[4776]: W0128 08:11:48.446671 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda16ec748_a55c_4903_a3af_92b00886f47c.slice/crio-57066107d23d97331dbccb390e756f50341e1b9b3e7fe84c34629cc5e176c765 WatchSource:0}: Error finding container 57066107d23d97331dbccb390e756f50341e1b9b3e7fe84c34629cc5e176c765: Status 404 returned error can't find the container with id 57066107d23d97331dbccb390e756f50341e1b9b3e7fe84c34629cc5e176c765 Jan 28 08:11:48 crc kubenswrapper[4776]: I0128 08:11:48.848032 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" event={"ID":"a16ec748-a55c-4903-a3af-92b00886f47c","Type":"ContainerStarted","Data":"57066107d23d97331dbccb390e756f50341e1b9b3e7fe84c34629cc5e176c765"} Jan 28 08:11:58 crc kubenswrapper[4776]: I0128 08:11:58.953012 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" event={"ID":"a16ec748-a55c-4903-a3af-92b00886f47c","Type":"ContainerStarted","Data":"99bcd5f5bac1d7d7140008d96d8255040ea0902bb12c4e7433499d7a1e4a6c2b"} Jan 28 08:11:58 crc kubenswrapper[4776]: I0128 
08:11:58.977680 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" podStartSLOduration=0.694756658 podStartE2EDuration="10.977661735s" podCreationTimestamp="2026-01-28 08:11:48 +0000 UTC" firstStartedPulling="2026-01-28 08:11:48.44900492 +0000 UTC m=+4879.864665080" lastFinishedPulling="2026-01-28 08:11:58.731909997 +0000 UTC m=+4890.147570157" observedRunningTime="2026-01-28 08:11:58.973347418 +0000 UTC m=+4890.389007588" watchObservedRunningTime="2026-01-28 08:11:58.977661735 +0000 UTC m=+4890.393321885" Jan 28 08:12:48 crc kubenswrapper[4776]: I0128 08:12:48.384660 4776 generic.go:334] "Generic (PLEG): container finished" podID="a16ec748-a55c-4903-a3af-92b00886f47c" containerID="99bcd5f5bac1d7d7140008d96d8255040ea0902bb12c4e7433499d7a1e4a6c2b" exitCode=0 Jan 28 08:12:48 crc kubenswrapper[4776]: I0128 08:12:48.384857 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" event={"ID":"a16ec748-a55c-4903-a3af-92b00886f47c","Type":"ContainerDied","Data":"99bcd5f5bac1d7d7140008d96d8255040ea0902bb12c4e7433499d7a1e4a6c2b"} Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.532574 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.583209 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-wp5ns"] Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.598972 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-wp5ns"] Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.672426 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host\") pod \"a16ec748-a55c-4903-a3af-92b00886f47c\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.672627 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qjcf\" (UniqueName: \"kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf\") pod \"a16ec748-a55c-4903-a3af-92b00886f47c\" (UID: \"a16ec748-a55c-4903-a3af-92b00886f47c\") " Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.672634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host" (OuterVolumeSpecName: "host") pod "a16ec748-a55c-4903-a3af-92b00886f47c" (UID: "a16ec748-a55c-4903-a3af-92b00886f47c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.673158 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16ec748-a55c-4903-a3af-92b00886f47c-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.680614 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf" (OuterVolumeSpecName: "kube-api-access-4qjcf") pod "a16ec748-a55c-4903-a3af-92b00886f47c" (UID: "a16ec748-a55c-4903-a3af-92b00886f47c"). InnerVolumeSpecName "kube-api-access-4qjcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:12:49 crc kubenswrapper[4776]: I0128 08:12:49.775539 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qjcf\" (UniqueName: \"kubernetes.io/projected/a16ec748-a55c-4903-a3af-92b00886f47c-kube-api-access-4qjcf\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.405538 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57066107d23d97331dbccb390e756f50341e1b9b3e7fe84c34629cc5e176c765" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.405957 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-wp5ns" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.768510 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k85x2/crc-debug-qntmd"] Jan 28 08:12:50 crc kubenswrapper[4776]: E0128 08:12:50.769099 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16ec748-a55c-4903-a3af-92b00886f47c" containerName="container-00" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.769121 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16ec748-a55c-4903-a3af-92b00886f47c" containerName="container-00" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.769470 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16ec748-a55c-4903-a3af-92b00886f47c" containerName="container-00" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.770431 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.904429 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:50 crc kubenswrapper[4776]: I0128 08:12:50.904980 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6g9\" (UniqueName: \"kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.007386 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6g9\" (UniqueName: 
\"kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.007523 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.007733 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.180222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6g9\" (UniqueName: \"kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9\") pod \"crc-debug-qntmd\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.335339 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16ec748-a55c-4903-a3af-92b00886f47c" path="/var/lib/kubelet/pods/a16ec748-a55c-4903-a3af-92b00886f47c/volumes" Jan 28 08:12:51 crc kubenswrapper[4776]: I0128 08:12:51.402834 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:52 crc kubenswrapper[4776]: I0128 08:12:52.462013 4776 generic.go:334] "Generic (PLEG): container finished" podID="306b9f1d-99f0-4857-a5d9-cf68808291da" containerID="c95af90b791a91ab1ac8cdf9d1c10564327e9481cd1bea29bae93e307137ca0b" exitCode=0 Jan 28 08:12:52 crc kubenswrapper[4776]: I0128 08:12:52.462108 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-qntmd" event={"ID":"306b9f1d-99f0-4857-a5d9-cf68808291da","Type":"ContainerDied","Data":"c95af90b791a91ab1ac8cdf9d1c10564327e9481cd1bea29bae93e307137ca0b"} Jan 28 08:12:52 crc kubenswrapper[4776]: I0128 08:12:52.462694 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-qntmd" event={"ID":"306b9f1d-99f0-4857-a5d9-cf68808291da","Type":"ContainerStarted","Data":"6e6758eb76eae65be17d46618a34f13a091ea050e84dcef4fc02b092fa5a6028"} Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.615448 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.764365 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host\") pod \"306b9f1d-99f0-4857-a5d9-cf68808291da\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.764464 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6g9\" (UniqueName: \"kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9\") pod \"306b9f1d-99f0-4857-a5d9-cf68808291da\" (UID: \"306b9f1d-99f0-4857-a5d9-cf68808291da\") " Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.764477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host" (OuterVolumeSpecName: "host") pod "306b9f1d-99f0-4857-a5d9-cf68808291da" (UID: "306b9f1d-99f0-4857-a5d9-cf68808291da"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.764904 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/306b9f1d-99f0-4857-a5d9-cf68808291da-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.769985 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9" (OuterVolumeSpecName: "kube-api-access-tv6g9") pod "306b9f1d-99f0-4857-a5d9-cf68808291da" (UID: "306b9f1d-99f0-4857-a5d9-cf68808291da"). InnerVolumeSpecName "kube-api-access-tv6g9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:12:53 crc kubenswrapper[4776]: I0128 08:12:53.866287 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6g9\" (UniqueName: \"kubernetes.io/projected/306b9f1d-99f0-4857-a5d9-cf68808291da-kube-api-access-tv6g9\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:54 crc kubenswrapper[4776]: I0128 08:12:54.484063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-qntmd" event={"ID":"306b9f1d-99f0-4857-a5d9-cf68808291da","Type":"ContainerDied","Data":"6e6758eb76eae65be17d46618a34f13a091ea050e84dcef4fc02b092fa5a6028"} Jan 28 08:12:54 crc kubenswrapper[4776]: I0128 08:12:54.484789 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6758eb76eae65be17d46618a34f13a091ea050e84dcef4fc02b092fa5a6028" Jan 28 08:12:54 crc kubenswrapper[4776]: I0128 08:12:54.484717 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-qntmd" Jan 28 08:12:54 crc kubenswrapper[4776]: I0128 08:12:54.698986 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-qntmd"] Jan 28 08:12:54 crc kubenswrapper[4776]: I0128 08:12:54.731932 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-qntmd"] Jan 28 08:12:55 crc kubenswrapper[4776]: I0128 08:12:55.340046 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306b9f1d-99f0-4857-a5d9-cf68808291da" path="/var/lib/kubelet/pods/306b9f1d-99f0-4857-a5d9-cf68808291da/volumes" Jan 28 08:12:55 crc kubenswrapper[4776]: I0128 08:12:55.962781 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k85x2/crc-debug-jv9hs"] Jan 28 08:12:55 crc kubenswrapper[4776]: E0128 08:12:55.963397 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306b9f1d-99f0-4857-a5d9-cf68808291da" 
containerName="container-00" Jan 28 08:12:55 crc kubenswrapper[4776]: I0128 08:12:55.963408 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="306b9f1d-99f0-4857-a5d9-cf68808291da" containerName="container-00" Jan 28 08:12:55 crc kubenswrapper[4776]: I0128 08:12:55.963630 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="306b9f1d-99f0-4857-a5d9-cf68808291da" containerName="container-00" Jan 28 08:12:55 crc kubenswrapper[4776]: I0128 08:12:55.964282 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.110279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcv6\" (UniqueName: \"kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.110648 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.213206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.213280 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcv6\" (UniqueName: 
\"kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.213331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.237458 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcv6\" (UniqueName: \"kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6\") pod \"crc-debug-jv9hs\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.279985 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:56 crc kubenswrapper[4776]: W0128 08:12:56.309875 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca1ee538_719b_431b_93da_77b1de400472.slice/crio-69f9981f4c01755ef31eec990af187aea60ad15b014c54385715477937f76c37 WatchSource:0}: Error finding container 69f9981f4c01755ef31eec990af187aea60ad15b014c54385715477937f76c37: Status 404 returned error can't find the container with id 69f9981f4c01755ef31eec990af187aea60ad15b014c54385715477937f76c37 Jan 28 08:12:56 crc kubenswrapper[4776]: I0128 08:12:56.504911 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" event={"ID":"ca1ee538-719b-431b-93da-77b1de400472","Type":"ContainerStarted","Data":"69f9981f4c01755ef31eec990af187aea60ad15b014c54385715477937f76c37"} Jan 28 08:12:57 crc kubenswrapper[4776]: I0128 08:12:57.518628 4776 generic.go:334] "Generic (PLEG): container finished" podID="ca1ee538-719b-431b-93da-77b1de400472" containerID="3d0c8937e81af0c96cf7525157fc85720eb94757b6ac2e1180ade7005a59659b" exitCode=0 Jan 28 08:12:57 crc kubenswrapper[4776]: I0128 08:12:57.518736 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" event={"ID":"ca1ee538-719b-431b-93da-77b1de400472","Type":"ContainerDied","Data":"3d0c8937e81af0c96cf7525157fc85720eb94757b6ac2e1180ade7005a59659b"} Jan 28 08:12:57 crc kubenswrapper[4776]: I0128 08:12:57.571483 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-jv9hs"] Jan 28 08:12:57 crc kubenswrapper[4776]: I0128 08:12:57.582481 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k85x2/crc-debug-jv9hs"] Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.624874 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.760510 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host\") pod \"ca1ee538-719b-431b-93da-77b1de400472\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.760662 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host" (OuterVolumeSpecName: "host") pod "ca1ee538-719b-431b-93da-77b1de400472" (UID: "ca1ee538-719b-431b-93da-77b1de400472"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.760884 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wcv6\" (UniqueName: \"kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6\") pod \"ca1ee538-719b-431b-93da-77b1de400472\" (UID: \"ca1ee538-719b-431b-93da-77b1de400472\") " Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.761392 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca1ee538-719b-431b-93da-77b1de400472-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.765731 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6" (OuterVolumeSpecName: "kube-api-access-5wcv6") pod "ca1ee538-719b-431b-93da-77b1de400472" (UID: "ca1ee538-719b-431b-93da-77b1de400472"). InnerVolumeSpecName "kube-api-access-5wcv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:12:58 crc kubenswrapper[4776]: I0128 08:12:58.863702 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wcv6\" (UniqueName: \"kubernetes.io/projected/ca1ee538-719b-431b-93da-77b1de400472-kube-api-access-5wcv6\") on node \"crc\" DevicePath \"\"" Jan 28 08:12:59 crc kubenswrapper[4776]: I0128 08:12:59.317668 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1ee538-719b-431b-93da-77b1de400472" path="/var/lib/kubelet/pods/ca1ee538-719b-431b-93da-77b1de400472/volumes" Jan 28 08:12:59 crc kubenswrapper[4776]: I0128 08:12:59.536142 4776 scope.go:117] "RemoveContainer" containerID="3d0c8937e81af0c96cf7525157fc85720eb94757b6ac2e1180ade7005a59659b" Jan 28 08:12:59 crc kubenswrapper[4776]: I0128 08:12:59.536180 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/crc-debug-jv9hs" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.203812 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55df755858-s7sbs_52586b79-6cf4-475f-852d-aa3c903b5b38/barbican-api/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.368886 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55df755858-s7sbs_52586b79-6cf4-475f-852d-aa3c903b5b38/barbican-api-log/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.455181 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f446b9874-8rzlp_3092241e-a9e3-4c51-b31b-36eae29a52e1/barbican-keystone-listener/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.507692 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f446b9874-8rzlp_3092241e-a9e3-4c51-b31b-36eae29a52e1/barbican-keystone-listener-log/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.622236 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cc6b97df7-r6gfn_87270d72-c59e-4526-b69b-ceaebfb13fdd/barbican-worker/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.669355 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cc6b97df7-r6gfn_87270d72-c59e-4526-b69b-ceaebfb13fdd/barbican-worker-log/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.830990 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb_9a3467e9-b4e8-40f9-8e96-3615aa7248ca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.909208 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/ceilometer-central-agent/0.log" Jan 28 08:13:28 crc kubenswrapper[4776]: I0128 08:13:28.940385 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/ceilometer-notification-agent/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.033165 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/proxy-httpd/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.055944 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/sg-core/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.191418 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce856766-46b4-4498-9aa0-bdf8c0e946db/cinder-api/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.248650 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce856766-46b4-4498-9aa0-bdf8c0e946db/cinder-api-log/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 
08:13:29.359511 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b39df015-b6fa-40eb-b270-21f01f3cb141/cinder-scheduler/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.458086 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b39df015-b6fa-40eb-b270-21f01f3cb141/probe/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.532360 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-w925h_aafcd74c-ce06-4b5f-a858-ed32676f7503/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.680289 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt_e525b968-aa0e-4d5a-9fe4-063ce4fdb686/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.800491 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/init/0.log" Jan 28 08:13:29 crc kubenswrapper[4776]: I0128 08:13:29.925706 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/init/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.096929 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp_ea52630b-ebcc-41d5-9265-eec1e8ae437d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.134954 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/dnsmasq-dns/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.267824 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_78f1d6b5-48e7-4ad3-8066-acb3faf83f73/glance-httpd/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.314036 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_78f1d6b5-48e7-4ad3-8066-acb3faf83f73/glance-log/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.444897 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_38e0faa6-840c-4e44-ad4a-16d42f83e194/glance-httpd/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.447003 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_38e0faa6-840c-4e44-ad4a-16d42f83e194/glance-log/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.645112 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fdfc784-krn2x_dc39478f-fee2-4eb1-89bc-789b5179a1ca/horizon/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.826333 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw_f05748ac-8e6e-4713-ae86-b0e4ffadec84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:30 crc kubenswrapper[4776]: I0128 08:13:30.959951 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-688gl_a00fc8a0-f777-496b-80d1-2c6e116d6e00/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:31 crc kubenswrapper[4776]: I0128 08:13:31.234751 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fdfc784-krn2x_dc39478f-fee2-4eb1-89bc-789b5179a1ca/horizon-log/0.log" Jan 28 08:13:31 crc kubenswrapper[4776]: I0128 08:13:31.406784 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-979f97b77-x2lng_9d7ed0f7-4d79-42b7-8f0d-805e6994e958/keystone-api/0.log" Jan 28 08:13:31 crc kubenswrapper[4776]: I0128 08:13:31.532644 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29493121-7fxmq_bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d/keystone-cron/0.log" Jan 28 08:13:31 crc kubenswrapper[4776]: I0128 08:13:31.618342 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c15b0ff9-2ff0-4eed-821d-ba0da8122d6d/kube-state-metrics/0.log" Jan 28 08:13:31 crc kubenswrapper[4776]: I0128 08:13:31.704443 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-288pn_5e450505-d924-4be0-8491-92297f012e24/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:32 crc kubenswrapper[4776]: I0128 08:13:32.107433 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg_36edfabc-d31a-4c3f-98d0-3c830a282c65/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:32 crc kubenswrapper[4776]: I0128 08:13:32.125570 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7967c58c5f-jkrnc_502a66df-cc30-46b3-98d4-d056d3497547/neutron-httpd/0.log" Jan 28 08:13:32 crc kubenswrapper[4776]: I0128 08:13:32.231762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7967c58c5f-jkrnc_502a66df-cc30-46b3-98d4-d056d3497547/neutron-api/0.log" Jan 28 08:13:32 crc kubenswrapper[4776]: I0128 08:13:32.717132 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_091b6025-b6c8-4c2b-81d7-7b25aeaef620/nova-cell0-conductor-conductor/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.100362 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_10f77e4a-2026-45c0-80c8-a5d8b18046df/nova-cell1-conductor-conductor/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.242979 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5a1237de-1bcc-4b1b-bff5-a775162f3ed9/nova-cell1-novncproxy-novncproxy/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.335187 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ca740aa-b1f4-4878-93f4-116c2c17ff53/nova-api-log/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.529590 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gtkx6_5f807cd7-856d-4fd5-afe4-963a4a77a5bf/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.607485 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_31eb87d0-ab51-4738-8205-b515b8b57cf1/nova-metadata-log/0.log" Jan 28 08:13:33 crc kubenswrapper[4776]: I0128 08:13:33.658240 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ca740aa-b1f4-4878-93f4-116c2c17ff53/nova-api-api/0.log" Jan 28 08:13:34 crc kubenswrapper[4776]: I0128 08:13:34.037847 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/mysql-bootstrap/0.log" Jan 28 08:13:34 crc kubenswrapper[4776]: I0128 08:13:34.140045 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7e68a0f9-2ffb-43a1-8945-37b6b68b2d43/nova-scheduler-scheduler/0.log" Jan 28 08:13:34 crc kubenswrapper[4776]: I0128 08:13:34.228847 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/mysql-bootstrap/0.log" Jan 28 08:13:34 crc kubenswrapper[4776]: I0128 08:13:34.248426 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/galera/0.log" Jan 28 08:13:34 crc kubenswrapper[4776]: I0128 08:13:34.899220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/mysql-bootstrap/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.074511 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/mysql-bootstrap/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.124326 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/galera/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.306137 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_abb12d44-fb9e-4ac4-95ad-a82606ff0709/openstackclient/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.352894 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lrzjl_4d2cb31b-ab97-4714-9978-225821819328/ovn-controller/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.566076 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-59r62_a390a227-8301-4ed3-80ee-06131089f499/openstack-network-exporter/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.650275 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_31eb87d0-ab51-4738-8205-b515b8b57cf1/nova-metadata-metadata/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.709584 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server-init/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.921846 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovs-vswitchd/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.930641 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server-init/0.log" Jan 28 08:13:35 crc kubenswrapper[4776]: I0128 08:13:35.984287 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server/0.log" Jan 28 08:13:36 crc kubenswrapper[4776]: I0128 08:13:36.143165 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kpxh8_34505caa-b76e-404f-b71a-a863e549d905/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:36 crc kubenswrapper[4776]: I0128 08:13:36.757627 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aff78447-a04f-4c5b-871f-3b47df7325c8/openstack-network-exporter/0.log" Jan 28 08:13:36 crc kubenswrapper[4776]: I0128 08:13:36.857120 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aff78447-a04f-4c5b-871f-3b47df7325c8/ovn-northd/0.log" Jan 28 08:13:36 crc kubenswrapper[4776]: I0128 08:13:36.934735 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fd4193d3-abf1-457c-a774-de938b12b909/openstack-network-exporter/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.051418 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fd4193d3-abf1-457c-a774-de938b12b909/ovsdbserver-nb/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.130438 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e356c55c-adea-433d-9f03-a403f330b085/openstack-network-exporter/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.177750 4776 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e356c55c-adea-433d-9f03-a403f330b085/ovsdbserver-sb/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.496514 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65b6d456f6-wvlnv_388a79ff-9e00-4cc1-a935-20a9b00402a8/placement-api/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.560537 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/init-config-reloader/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.600684 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65b6d456f6-wvlnv_388a79ff-9e00-4cc1-a935-20a9b00402a8/placement-log/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.685322 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/init-config-reloader/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.746295 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/prometheus/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.746984 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/config-reloader/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.830055 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/thanos-sidecar/0.log" Jan 28 08:13:37 crc kubenswrapper[4776]: I0128 08:13:37.942132 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/setup-container/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.171476 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/rabbitmq/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.201832 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/setup-container/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.243122 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/setup-container/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.427176 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/setup-container/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.476695 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/rabbitmq/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.532130 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm_e7a8bf84-fa6c-4637-bac6-cb9da6206f31/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.683375 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m4zrt_de16818c-1081-4db9-a329-04c845b7ec51/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.753284 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8_496f75bc-3d43-4af8-8bf4-c818f9b4db9d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.923531 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zcz2d_fa499a35-59bb-4ee1-93b4-98ab890c2126/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:38 crc kubenswrapper[4776]: I0128 08:13:38.973537 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vb66j_f2d64cd0-fb47-4169-88d3-dec3bb7591b0/ssh-known-hosts-edpm-deployment/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.265667 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c7dcc8f9-s7l5r_157a7da1-1327-40fc-83f3-30c1ef472c78/proxy-server/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.374038 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cmmlx_a83b6bd4-3813-465a-aa62-8bb029d2fcc0/swift-ring-rebalance/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.435468 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c7dcc8f9-s7l5r_157a7da1-1327-40fc-83f3-30c1ef472c78/proxy-httpd/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.467960 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-auditor/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.616886 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-reaper/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.700844 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-replicator/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.705173 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-auditor/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 
08:13:39.743771 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-server/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.862134 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-replicator/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.916991 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-updater/0.log" Jan 28 08:13:39 crc kubenswrapper[4776]: I0128 08:13:39.952984 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-server/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.010429 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-auditor/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.087051 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-expirer/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.109839 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-replicator/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.158407 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-server/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.214571 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-updater/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.310805 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/rsync/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.356182 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/swift-recon-cron/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.459333 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fp67m_1dde4f71-00f6-46fa-b16c-429edb9ee1ce/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.553489 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0605b294-d429-4bfd-8924-39f8cb5cb105/tempest-tests-tempest-tests-runner/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.626744 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_40947156-8378-437f-935a-da00e0908508/test-operator-logs-container/0.log" Jan 28 08:13:40 crc kubenswrapper[4776]: I0128 08:13:40.782106 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b_cb26e5c4-e4de-4bca-86c0-160dffb2bb73/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:13:41 crc kubenswrapper[4776]: I0128 08:13:41.553087 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_e0bb7f08-c9fa-4595-9b5e-b80ff3821169/watcher-applier/0.log" Jan 28 08:13:41 crc kubenswrapper[4776]: I0128 08:13:41.923259 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8c06de04-7886-4696-8416-3559c16a5f7f/watcher-api-log/0.log" Jan 28 08:13:42 crc kubenswrapper[4776]: I0128 08:13:42.607290 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-decision-engine-0_c2cf5aeb-349c-47d3-989b-d56e91f7ff51/watcher-decision-engine/0.log" Jan 28 08:13:44 crc kubenswrapper[4776]: I0128 08:13:44.034881 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f1377523-89dd-4311-886a-af2f7bb607b8/memcached/0.log" Jan 28 08:13:44 crc kubenswrapper[4776]: I0128 08:13:44.559198 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8c06de04-7886-4696-8416-3559c16a5f7f/watcher-api/0.log" Jan 28 08:14:03 crc kubenswrapper[4776]: I0128 08:14:03.852638 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:14:03 crc kubenswrapper[4776]: I0128 08:14:03.853082 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:14:09 crc kubenswrapper[4776]: I0128 08:14:09.515304 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-65ff799cfd-vzbmt_bd49109f-40b2-4db9-92d7-75aaf1093a21/manager/0.log" Jan 28 08:14:09 crc kubenswrapper[4776]: I0128 08:14:09.669218 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-655bf9cfbb-wjdst_79937ab5-c85f-4a4a-b35f-3b5d3711cbf0/manager/0.log" Jan 28 08:14:09 crc kubenswrapper[4776]: I0128 08:14:09.772362 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77554cdc5c-vpc6t_b5c3560a-18be-4f65-a9f7-0dddccb36193/manager/0.log" Jan 28 08:14:09 crc kubenswrapper[4776]: I0128 08:14:09.875741 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.025431 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.040747 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.044110 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.201245 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.236706 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/extract/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.242299 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:14:10 crc 
kubenswrapper[4776]: I0128 08:14:10.503160 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-575ffb885b-pxsb4_1a0ddddf-b0e4-4bdb-bf00-c978366213a0/manager/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.524261 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67dd55ff59-xw2v6_11a6de65-3758-4462-b2b0-9499232f8c29/manager/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.809906 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vwpnd_60dda427-fb0c-41c7-8ca8-9847554068f1/manager/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.925576 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-768b776ffb-4mt8c_846af064-1eb1-4384-9b88-95770199bcdc/manager/0.log" Jan 28 08:14:10 crc kubenswrapper[4776]: I0128 08:14:10.957141 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d75bc88d5-2fbq4_f9f1432a-2977-49f8-924a-5c82c86f1de0/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.117719 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55f684fd56-d2j6c_bf30e81e-a5a3-4af7-9a47-673f431d3666/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.166487 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-849fcfbb6b-pxjl4_39d3648e-5826-4e8a-b252-cb75e28651db/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.317405 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-lltvt_93dd9036-0e5e-4817-9a6c-eb89469de01b/manager/0.log" Jan 28 
08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.383206 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7ffd8d76d4-vrlcf_38646136-0a67-43c4-90ee-d88ae407d654/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.588313 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-wwhp5_6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.616766 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-94dd99d7d-gxmgb_a4407ded-de50-4ae5-bf84-2d6a3baa565c/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.758744 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hbb56_9eae60fd-6135-4e41-bb77-e3caae71237d/manager/0.log" Jan 28 08:14:11 crc kubenswrapper[4776]: I0128 08:14:11.999454 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5667c869b5-csrzg_bca7b855-4473-4cc2-aa88-38fd3de8fea8/operator/0.log" Jan 28 08:14:12 crc kubenswrapper[4776]: I0128 08:14:12.211160 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-njlm4_ce3e5ab9-01db-487b-9176-60b655f03b9b/registry-server/0.log" Jan 28 08:14:12 crc kubenswrapper[4776]: I0128 08:14:12.401573 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-4xx26_6a91b170-b0ed-4156-a9ee-74efca2560e7/manager/0.log" Jan 28 08:14:12 crc kubenswrapper[4776]: I0128 08:14:12.520633 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-6kqqd_9390b35e-9791-4ef4-ab66-12c4662f4cdf/manager/0.log" 
Jan 28 08:14:12 crc kubenswrapper[4776]: I0128 08:14:12.671901 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8wdwl_22f3f762-cc29-4a18-8bfc-430b85e041cc/operator/0.log" Jan 28 08:14:13 crc kubenswrapper[4776]: I0128 08:14:13.020688 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-8w6p2_5bc7efb1-0792-40f2-993a-eb865919048c/manager/0.log" Jan 28 08:14:13 crc kubenswrapper[4776]: I0128 08:14:13.280034 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-799bc87c89-jfcj6_6f563213-8471-44f5-83aa-820e73ed7746/manager/0.log" Jan 28 08:14:13 crc kubenswrapper[4776]: I0128 08:14:13.299449 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fdbb46688-tnzx5_70aa7185-ded8-4807-822c-69fc5b03feeb/manager/0.log" Jan 28 08:14:13 crc kubenswrapper[4776]: I0128 08:14:13.780074 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-8dl97_2f79777a-6f48-42d4-b39e-4393e932aea0/manager/0.log" Jan 28 08:14:13 crc kubenswrapper[4776]: I0128 08:14:13.850196 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66fbd46fdf-dpq5g_3b6f6ae6-4641-4dd2-9021-197e9ea97b2b/manager/0.log" Jan 28 08:14:33 crc kubenswrapper[4776]: I0128 08:14:33.852418 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:14:33 crc kubenswrapper[4776]: I0128 08:14:33.853011 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:14:34 crc kubenswrapper[4776]: I0128 08:14:34.412330 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5bxjg_a8386b67-8be2-4d18-9358-fccd65c363db/control-plane-machine-set-operator/0.log" Jan 28 08:14:34 crc kubenswrapper[4776]: I0128 08:14:34.547049 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xm8h7_4fb7ccb2-9c11-4273-9888-f45aea05803d/kube-rbac-proxy/0.log" Jan 28 08:14:34 crc kubenswrapper[4776]: I0128 08:14:34.605104 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xm8h7_4fb7ccb2-9c11-4273-9888-f45aea05803d/machine-api-operator/0.log" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.539154 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:41 crc kubenswrapper[4776]: E0128 08:14:41.540094 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1ee538-719b-431b-93da-77b1de400472" containerName="container-00" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.540106 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1ee538-719b-431b-93da-77b1de400472" containerName="container-00" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.540331 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1ee538-719b-431b-93da-77b1de400472" containerName="container-00" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.541721 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.563643 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.652254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.652501 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjdjh\" (UniqueName: \"kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.652538 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.754153 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjdjh\" (UniqueName: \"kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.754203 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.754235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.754697 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.754736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.774851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjdjh\" (UniqueName: \"kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh\") pod \"redhat-marketplace-cc976\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:41 crc kubenswrapper[4776]: I0128 08:14:41.861671 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:42 crc kubenswrapper[4776]: I0128 08:14:42.357396 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:42 crc kubenswrapper[4776]: I0128 08:14:42.497438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerStarted","Data":"4bf5ad2ace59e2c5b8077e13b48597dbbf4e4810590502204de972210c781b19"} Jan 28 08:14:43 crc kubenswrapper[4776]: I0128 08:14:43.507149 4776 generic.go:334] "Generic (PLEG): container finished" podID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerID="ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7" exitCode=0 Jan 28 08:14:43 crc kubenswrapper[4776]: I0128 08:14:43.507197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerDied","Data":"ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7"} Jan 28 08:14:43 crc kubenswrapper[4776]: I0128 08:14:43.510468 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 08:14:44 crc kubenswrapper[4776]: I0128 08:14:44.518323 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerStarted","Data":"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86"} Jan 28 08:14:45 crc kubenswrapper[4776]: I0128 08:14:45.531472 4776 generic.go:334] "Generic (PLEG): container finished" podID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerID="e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86" exitCode=0 Jan 28 08:14:45 crc kubenswrapper[4776]: I0128 08:14:45.531518 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerDied","Data":"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86"} Jan 28 08:14:46 crc kubenswrapper[4776]: I0128 08:14:46.544781 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerStarted","Data":"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a"} Jan 28 08:14:46 crc kubenswrapper[4776]: I0128 08:14:46.564948 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cc976" podStartSLOduration=3.155093399 podStartE2EDuration="5.56493125s" podCreationTimestamp="2026-01-28 08:14:41 +0000 UTC" firstStartedPulling="2026-01-28 08:14:43.510239551 +0000 UTC m=+5054.925899711" lastFinishedPulling="2026-01-28 08:14:45.920077402 +0000 UTC m=+5057.335737562" observedRunningTime="2026-01-28 08:14:46.563405498 +0000 UTC m=+5057.979065668" watchObservedRunningTime="2026-01-28 08:14:46.56493125 +0000 UTC m=+5057.980591410" Jan 28 08:14:49 crc kubenswrapper[4776]: I0128 08:14:49.409903 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kh746_ebf51615-2906-4bc1-9224-7bdc14f6afa6/cert-manager-controller/0.log" Jan 28 08:14:49 crc kubenswrapper[4776]: I0128 08:14:49.590931 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-jrkz5_8c08bbc8-20fd-452e-8d53-4baa6ac41fc2/cert-manager-cainjector/0.log" Jan 28 08:14:49 crc kubenswrapper[4776]: I0128 08:14:49.735328 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-5lrjh_24f93919-f6ec-481d-b6f3-0bfd6fdb7e01/cert-manager-webhook/0.log" Jan 28 08:14:51 crc kubenswrapper[4776]: I0128 08:14:51.861930 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:51 crc kubenswrapper[4776]: I0128 08:14:51.862270 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:51 crc kubenswrapper[4776]: I0128 08:14:51.931479 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:52 crc kubenswrapper[4776]: I0128 08:14:52.652318 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:52 crc kubenswrapper[4776]: I0128 08:14:52.710176 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:54 crc kubenswrapper[4776]: I0128 08:14:54.617048 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cc976" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="registry-server" containerID="cri-o://8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a" gracePeriod=2 Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.102361 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.215127 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content\") pod \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.215257 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities\") pod \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.215392 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjdjh\" (UniqueName: \"kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh\") pod \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\" (UID: \"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4\") " Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.215989 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities" (OuterVolumeSpecName: "utilities") pod "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" (UID: "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.223260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh" (OuterVolumeSpecName: "kube-api-access-gjdjh") pod "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" (UID: "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4"). InnerVolumeSpecName "kube-api-access-gjdjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.244025 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" (UID: "5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.317881 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.317912 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.317921 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjdjh\" (UniqueName: \"kubernetes.io/projected/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4-kube-api-access-gjdjh\") on node \"crc\" DevicePath \"\"" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.632443 4776 generic.go:334] "Generic (PLEG): container finished" podID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerID="8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a" exitCode=0 Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.633380 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerDied","Data":"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a"} Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.633593 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-cc976" event={"ID":"5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4","Type":"ContainerDied","Data":"4bf5ad2ace59e2c5b8077e13b48597dbbf4e4810590502204de972210c781b19"} Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.633619 4776 scope.go:117] "RemoveContainer" containerID="8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.633397 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cc976" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.671206 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.674983 4776 scope.go:117] "RemoveContainer" containerID="e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.681820 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cc976"] Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.695744 4776 scope.go:117] "RemoveContainer" containerID="ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.743123 4776 scope.go:117] "RemoveContainer" containerID="8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a" Jan 28 08:14:55 crc kubenswrapper[4776]: E0128 08:14:55.743536 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a\": container with ID starting with 8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a not found: ID does not exist" containerID="8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.743600 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a"} err="failed to get container status \"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a\": rpc error: code = NotFound desc = could not find container \"8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a\": container with ID starting with 8e5f8d5da035f33e9beb628d673f618705ad24e411c640a6a769f75f4924ae7a not found: ID does not exist" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.743631 4776 scope.go:117] "RemoveContainer" containerID="e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86" Jan 28 08:14:55 crc kubenswrapper[4776]: E0128 08:14:55.744057 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86\": container with ID starting with e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86 not found: ID does not exist" containerID="e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.744096 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86"} err="failed to get container status \"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86\": rpc error: code = NotFound desc = could not find container \"e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86\": container with ID starting with e3922801ca604d489222d8b0e7f72633f9c9c7c063c3abcf1f3c58087948fa86 not found: ID does not exist" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.744123 4776 scope.go:117] "RemoveContainer" containerID="ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7" Jan 28 08:14:55 crc kubenswrapper[4776]: E0128 
08:14:55.744452 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7\": container with ID starting with ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7 not found: ID does not exist" containerID="ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7" Jan 28 08:14:55 crc kubenswrapper[4776]: I0128 08:14:55.744486 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7"} err="failed to get container status \"ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7\": rpc error: code = NotFound desc = could not find container \"ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7\": container with ID starting with ca393cc3f5e18dde4ceb50039131a0ad3bdb2fb3cb302bbead21c156892881e7 not found: ID does not exist" Jan 28 08:14:57 crc kubenswrapper[4776]: I0128 08:14:57.325766 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" path="/var/lib/kubelet/pods/5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4/volumes" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.147179 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp"] Jan 28 08:15:00 crc kubenswrapper[4776]: E0128 08:15:00.148269 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="registry-server" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.148287 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="registry-server" Jan 28 08:15:00 crc kubenswrapper[4776]: E0128 08:15:00.148313 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="extract-content" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.148320 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="extract-content" Jan 28 08:15:00 crc kubenswrapper[4776]: E0128 08:15:00.148344 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="extract-utilities" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.148355 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="extract-utilities" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.148580 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5025fbd6-ba6e-48f6-8b1e-81b7ae716ff4" containerName="registry-server" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.149387 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.151397 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.151688 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.158968 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp"] Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.217604 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsdh7\" (UniqueName: \"kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7\") pod 
\"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.217908 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.218035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.319976 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.320122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.320187 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vsdh7\" (UniqueName: \"kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.321719 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.326514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.341731 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsdh7\" (UniqueName: \"kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7\") pod \"collect-profiles-29493135-rzvnp\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.482868 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:00 crc kubenswrapper[4776]: I0128 08:15:00.973927 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp"] Jan 28 08:15:01 crc kubenswrapper[4776]: I0128 08:15:01.712643 4776 generic.go:334] "Generic (PLEG): container finished" podID="1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" containerID="34a0eb627f83e563b8779e0a47022805fbac1b779732a4823d16a761f6a2f542" exitCode=0 Jan 28 08:15:01 crc kubenswrapper[4776]: I0128 08:15:01.712739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" event={"ID":"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa","Type":"ContainerDied","Data":"34a0eb627f83e563b8779e0a47022805fbac1b779732a4823d16a761f6a2f542"} Jan 28 08:15:01 crc kubenswrapper[4776]: I0128 08:15:01.713077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" event={"ID":"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa","Type":"ContainerStarted","Data":"748994368e126a63665bcccf4ddeb9e3b630081090ae2ba77c3008d9ee0704c2"} Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.134502 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.207786 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume\") pod \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.207888 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume\") pod \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.208007 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsdh7\" (UniqueName: \"kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7\") pod \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\" (UID: \"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa\") " Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.208690 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" (UID: "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.209103 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.230898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" (UID: "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.230911 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7" (OuterVolumeSpecName: "kube-api-access-vsdh7") pod "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" (UID: "1bfcdacc-09be-4a5c-8a9e-5c730adb19fa"). InnerVolumeSpecName "kube-api-access-vsdh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.312013 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.312203 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsdh7\" (UniqueName: \"kubernetes.io/projected/1bfcdacc-09be-4a5c-8a9e-5c730adb19fa-kube-api-access-vsdh7\") on node \"crc\" DevicePath \"\"" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.731248 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" event={"ID":"1bfcdacc-09be-4a5c-8a9e-5c730adb19fa","Type":"ContainerDied","Data":"748994368e126a63665bcccf4ddeb9e3b630081090ae2ba77c3008d9ee0704c2"} Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.731302 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="748994368e126a63665bcccf4ddeb9e3b630081090ae2ba77c3008d9ee0704c2" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.731424 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493135-rzvnp" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.737423 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-49nnl_59581e1b-5fa1-4649-b461-20815879a250/nmstate-console-plugin/0.log" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.851799 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.851863 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.851925 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.852779 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.852857 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" 
containerName="machine-config-daemon" containerID="cri-o://7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817" gracePeriod=600 Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.936351 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fn6dp_087b920d-366b-475c-85f2-e5512596d3f8/nmstate-handler/0.log" Jan 28 08:15:03 crc kubenswrapper[4776]: I0128 08:15:03.958867 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qgksg_2f1d6d84-d95e-4423-a7c1-7fa987beff1c/kube-rbac-proxy/0.log" Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.069595 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qgksg_2f1d6d84-d95e-4423-a7c1-7fa987beff1c/nmstate-metrics/0.log" Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.182167 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-5x9ww_180b60f1-288a-4292-9aab-4322b1d1bce2/nmstate-operator/0.log" Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.218065 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6"] Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.227148 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493090-zkzm6"] Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.346233 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-mjfqk_ff12b52a-7e92-45bd-afd9-e0b577a8607d/nmstate-webhook/0.log" Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.740957 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817" exitCode=0 Jan 28 08:15:04 crc 
kubenswrapper[4776]: I0128 08:15:04.741019 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817"} Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.741318 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1"} Jan 28 08:15:04 crc kubenswrapper[4776]: I0128 08:15:04.741355 4776 scope.go:117] "RemoveContainer" containerID="827c0d32ac75e02c44c473afbf58acc520f768147e36f451171e4eab6c4d272a" Jan 28 08:15:05 crc kubenswrapper[4776]: I0128 08:15:05.317565 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c50c04-4144-4b30-899a-c0ed5e61eb11" path="/var/lib/kubelet/pods/19c50c04-4144-4b30-899a-c0ed5e61eb11/volumes" Jan 28 08:15:18 crc kubenswrapper[4776]: I0128 08:15:18.730057 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9v6mf_633bf947-38aa-4444-911d-ea2f55433a93/prometheus-operator/0.log" Jan 28 08:15:18 crc kubenswrapper[4776]: I0128 08:15:18.873167 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp_f52dbbd1-d020-4074-93eb-706fff6e588b/prometheus-operator-admission-webhook/0.log" Jan 28 08:15:18 crc kubenswrapper[4776]: I0128 08:15:18.906502 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf_194dde85-71e7-4d74-80c4-59e327ac851a/prometheus-operator-admission-webhook/0.log" Jan 28 08:15:19 crc kubenswrapper[4776]: I0128 08:15:19.091779 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-srqjl_215d3c95-e6d6-4022-a435-f6c30c630727/operator/0.log" Jan 28 08:15:19 crc kubenswrapper[4776]: I0128 08:15:19.110692 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qd6sh_f1812216-a0b3-4ae2-9c2c-7086dc74163b/perses-operator/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.481820 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9qzfc_896d6757-3340-421c-937a-d6e35e752bdc/kube-rbac-proxy/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.539140 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9qzfc_896d6757-3340-421c-937a-d6e35e752bdc/controller/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.678932 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.814804 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.829376 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.837586 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:15:33 crc kubenswrapper[4776]: I0128 08:15:33.874408 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.348584 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.385570 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.439877 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.453652 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.602383 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.637971 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.658186 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.659220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/controller/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.790553 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/frr-metrics/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.814688 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/kube-rbac-proxy/0.log" Jan 28 08:15:34 crc kubenswrapper[4776]: I0128 08:15:34.863363 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/kube-rbac-proxy-frr/0.log" Jan 28 08:15:35 crc kubenswrapper[4776]: I0128 08:15:35.011710 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/reloader/0.log" Jan 28 08:15:35 crc kubenswrapper[4776]: I0128 08:15:35.130576 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-mbw9s_a349654d-030c-4341-b884-8f295ea9dfa9/frr-k8s-webhook-server/0.log" Jan 28 08:15:35 crc kubenswrapper[4776]: I0128 08:15:35.224763 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74f4f84-5s97b_b127309b-519f-42d4-9aca-30708ae2aae1/manager/0.log" Jan 28 08:15:35 crc kubenswrapper[4776]: I0128 08:15:35.437902 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-f68d4f57-8pgd6_10b029bb-8821-4602-9b1d-910d59efc97a/webhook-server/0.log" Jan 28 08:15:35 crc kubenswrapper[4776]: I0128 08:15:35.489942 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mjlbx_6f2ff038-c715-4cff-a872-ac6ae5c7fbff/kube-rbac-proxy/0.log" Jan 28 08:15:36 crc kubenswrapper[4776]: I0128 08:15:36.029981 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mjlbx_6f2ff038-c715-4cff-a872-ac6ae5c7fbff/speaker/0.log" Jan 28 08:15:36 crc kubenswrapper[4776]: I0128 08:15:36.358312 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/frr/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.151400 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.298861 4776 scope.go:117] "RemoveContainer" containerID="476fc56e33a0e03fc74a5babe6cfa95585e7cc3ac70af9eb6dd7aab2136ab12c" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.428741 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.438399 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.446908 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.605612 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.646887 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.682274 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/extract/0.log" Jan 28 08:15:50 crc kubenswrapper[4776]: I0128 08:15:50.834124 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.127467 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.146824 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.188067 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.323570 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.364623 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.408794 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/extract/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.550249 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.712849 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.735637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.764074 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.883779 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.919344 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:15:51 crc kubenswrapper[4776]: I0128 08:15:51.959828 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/extract/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.066628 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:15:52 crc 
kubenswrapper[4776]: I0128 08:15:52.300908 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.343016 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.350121 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.508697 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.570302 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.711026 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/registry-server/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.767431 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.883444 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.916928 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:15:52 crc kubenswrapper[4776]: I0128 08:15:52.927030 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.133570 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.218406 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.388191 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/registry-server/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.560471 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-45lfz_3d2605e3-4b9a-4dc8-8936-b209875dbdee/marketplace-operator/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.578905 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.771515 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.778030 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:15:53 crc kubenswrapper[4776]: I0128 08:15:53.806009 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.035711 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.060318 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.162285 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.169914 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/registry-server/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.348832 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.369105 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.392194 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" 
Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.561069 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" Jan 28 08:15:54 crc kubenswrapper[4776]: I0128 08:15:54.589158 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" Jan 28 08:15:55 crc kubenswrapper[4776]: I0128 08:15:55.110660 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/registry-server/0.log" Jan 28 08:16:10 crc kubenswrapper[4776]: I0128 08:16:10.627996 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9v6mf_633bf947-38aa-4444-911d-ea2f55433a93/prometheus-operator/0.log" Jan 28 08:16:10 crc kubenswrapper[4776]: I0128 08:16:10.652726 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp_f52dbbd1-d020-4074-93eb-706fff6e588b/prometheus-operator-admission-webhook/0.log" Jan 28 08:16:11 crc kubenswrapper[4776]: I0128 08:16:11.191831 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qd6sh_f1812216-a0b3-4ae2-9c2c-7086dc74163b/perses-operator/0.log" Jan 28 08:16:11 crc kubenswrapper[4776]: I0128 08:16:11.206200 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf_194dde85-71e7-4d74-80c4-59e327ac851a/prometheus-operator-admission-webhook/0.log" Jan 28 08:16:11 crc kubenswrapper[4776]: I0128 08:16:11.230947 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-srqjl_215d3c95-e6d6-4022-a435-f6c30c630727/operator/0.log" Jan 28 08:16:17 crc kubenswrapper[4776]: E0128 08:16:17.245659 4776 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.195:38524->38.102.83.195:34377: write tcp 38.102.83.195:38524->38.102.83.195:34377: write: connection reset by peer Jan 28 08:17:05 crc kubenswrapper[4776]: I0128 08:17:05.731028 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-f68d4f57-8pgd6" podUID="10b029bb-8821-4602-9b1d-910d59efc97a" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.53:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 08:17:33 crc kubenswrapper[4776]: I0128 08:17:33.852961 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:17:33 crc kubenswrapper[4776]: I0128 08:17:33.853697 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:18:03 crc kubenswrapper[4776]: I0128 08:18:03.855017 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:18:03 crc kubenswrapper[4776]: I0128 08:18:03.855542 4776 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:18:06 crc kubenswrapper[4776]: I0128 08:18:06.433043 4776 generic.go:334] "Generic (PLEG): container finished" podID="7bba7db5-580d-401d-9808-aab65fe407c1" containerID="bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2" exitCode=0 Jan 28 08:18:06 crc kubenswrapper[4776]: I0128 08:18:06.433204 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k85x2/must-gather-9pzlp" event={"ID":"7bba7db5-580d-401d-9808-aab65fe407c1","Type":"ContainerDied","Data":"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2"} Jan 28 08:18:06 crc kubenswrapper[4776]: I0128 08:18:06.435878 4776 scope.go:117] "RemoveContainer" containerID="bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2" Jan 28 08:18:06 crc kubenswrapper[4776]: I0128 08:18:06.870826 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k85x2_must-gather-9pzlp_7bba7db5-580d-401d-9808-aab65fe407c1/gather/0.log" Jan 28 08:18:15 crc kubenswrapper[4776]: I0128 08:18:15.822711 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k85x2/must-gather-9pzlp"] Jan 28 08:18:15 crc kubenswrapper[4776]: I0128 08:18:15.823497 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k85x2/must-gather-9pzlp" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="copy" containerID="cri-o://b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7" gracePeriod=2 Jan 28 08:18:15 crc kubenswrapper[4776]: I0128 08:18:15.836284 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k85x2/must-gather-9pzlp"] Jan 28 
08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.318704 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k85x2_must-gather-9pzlp_7bba7db5-580d-401d-9808-aab65fe407c1/copy/0.log" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.319290 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.482115 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output\") pod \"7bba7db5-580d-401d-9808-aab65fe407c1\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.482378 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpwdw\" (UniqueName: \"kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw\") pod \"7bba7db5-580d-401d-9808-aab65fe407c1\" (UID: \"7bba7db5-580d-401d-9808-aab65fe407c1\") " Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.491813 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw" (OuterVolumeSpecName: "kube-api-access-fpwdw") pod "7bba7db5-580d-401d-9808-aab65fe407c1" (UID: "7bba7db5-580d-401d-9808-aab65fe407c1"). InnerVolumeSpecName "kube-api-access-fpwdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.555131 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k85x2_must-gather-9pzlp_7bba7db5-580d-401d-9808-aab65fe407c1/copy/0.log" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.558805 4776 generic.go:334] "Generic (PLEG): container finished" podID="7bba7db5-580d-401d-9808-aab65fe407c1" containerID="b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7" exitCode=143 Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.558869 4776 scope.go:117] "RemoveContainer" containerID="b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.559032 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k85x2/must-gather-9pzlp" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.583223 4776 scope.go:117] "RemoveContainer" containerID="bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.585084 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpwdw\" (UniqueName: \"kubernetes.io/projected/7bba7db5-580d-401d-9808-aab65fe407c1-kube-api-access-fpwdw\") on node \"crc\" DevicePath \"\"" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.660872 4776 scope.go:117] "RemoveContainer" containerID="b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7" Jan 28 08:18:16 crc kubenswrapper[4776]: E0128 08:18:16.661719 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7\": container with ID starting with b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7 not found: ID does not exist" 
containerID="b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.661772 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7"} err="failed to get container status \"b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7\": rpc error: code = NotFound desc = could not find container \"b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7\": container with ID starting with b554297c21f1caef150deb3c82d34f0a3f9dae4e3c145daad7a1c08cc44fa0b7 not found: ID does not exist" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.661807 4776 scope.go:117] "RemoveContainer" containerID="bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2" Jan 28 08:18:16 crc kubenswrapper[4776]: E0128 08:18:16.662241 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2\": container with ID starting with bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2 not found: ID does not exist" containerID="bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.662296 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2"} err="failed to get container status \"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2\": rpc error: code = NotFound desc = could not find container \"bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2\": container with ID starting with bc94f5c27b83422783b5faa61dc09e4e024bff537428af9ee02a1fdd163e7cb2 not found: ID does not exist" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.665408 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7bba7db5-580d-401d-9808-aab65fe407c1" (UID: "7bba7db5-580d-401d-9808-aab65fe407c1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:18:16 crc kubenswrapper[4776]: I0128 08:18:16.686803 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7bba7db5-580d-401d-9808-aab65fe407c1-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 28 08:18:17 crc kubenswrapper[4776]: I0128 08:18:17.314572 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" path="/var/lib/kubelet/pods/7bba7db5-580d-401d-9808-aab65fe407c1/volumes" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.964326 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:28 crc kubenswrapper[4776]: E0128 08:18:28.965638 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="copy" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.965660 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="copy" Jan 28 08:18:28 crc kubenswrapper[4776]: E0128 08:18:28.965699 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="gather" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.965713 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="gather" Jan 28 08:18:28 crc kubenswrapper[4776]: E0128 08:18:28.965742 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" containerName="collect-profiles" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.965756 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" containerName="collect-profiles" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.966121 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="gather" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.966158 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bfcdacc-09be-4a5c-8a9e-5c730adb19fa" containerName="collect-profiles" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.966212 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bba7db5-580d-401d-9808-aab65fe407c1" containerName="copy" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.968858 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:28 crc kubenswrapper[4776]: I0128 08:18:28.977606 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.151378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.151422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvkq\" (UniqueName: \"kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq\") pod \"community-operators-d2pqb\" (UID: 
\"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.151627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.253592 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.253644 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvkq\" (UniqueName: \"kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.253719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.254231 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities\") pod \"community-operators-d2pqb\" (UID: 
\"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.254242 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.280112 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvkq\" (UniqueName: \"kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq\") pod \"community-operators-d2pqb\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.300258 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:29 crc kubenswrapper[4776]: I0128 08:18:29.850677 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:30 crc kubenswrapper[4776]: I0128 08:18:30.724519 4776 generic.go:334] "Generic (PLEG): container finished" podID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerID="22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f" exitCode=0 Jan 28 08:18:30 crc kubenswrapper[4776]: I0128 08:18:30.724578 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerDied","Data":"22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f"} Jan 28 08:18:30 crc kubenswrapper[4776]: I0128 08:18:30.724850 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerStarted","Data":"c9463c2ea0c72af92899a090fa3ba1877c70aac61b8f90cb3df38236fa093867"} Jan 28 08:18:31 crc kubenswrapper[4776]: I0128 08:18:31.740021 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerStarted","Data":"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033"} Jan 28 08:18:32 crc kubenswrapper[4776]: I0128 08:18:32.749751 4776 generic.go:334] "Generic (PLEG): container finished" podID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerID="39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033" exitCode=0 Jan 28 08:18:32 crc kubenswrapper[4776]: I0128 08:18:32.749785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" 
event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerDied","Data":"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033"} Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.765506 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerStarted","Data":"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d"} Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.791607 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d2pqb" podStartSLOduration=3.374465475 podStartE2EDuration="5.791587023s" podCreationTimestamp="2026-01-28 08:18:28 +0000 UTC" firstStartedPulling="2026-01-28 08:18:30.72674136 +0000 UTC m=+5282.142401520" lastFinishedPulling="2026-01-28 08:18:33.143862868 +0000 UTC m=+5284.559523068" observedRunningTime="2026-01-28 08:18:33.791145502 +0000 UTC m=+5285.206805692" watchObservedRunningTime="2026-01-28 08:18:33.791587023 +0000 UTC m=+5285.207247193" Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.852663 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.852750 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.852799 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.853598 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:18:33 crc kubenswrapper[4776]: I0128 08:18:33.853744 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" gracePeriod=600 Jan 28 08:18:33 crc kubenswrapper[4776]: E0128 08:18:33.975951 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:18:34 crc kubenswrapper[4776]: I0128 08:18:34.776956 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" exitCode=0 Jan 28 08:18:34 crc kubenswrapper[4776]: I0128 08:18:34.777132 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1"} Jan 28 08:18:34 crc 
kubenswrapper[4776]: I0128 08:18:34.777719 4776 scope.go:117] "RemoveContainer" containerID="7218f61cf76b54d31c01362ba5acfec5b8d9c04a058abe1ab823351fb2870817" Jan 28 08:18:34 crc kubenswrapper[4776]: I0128 08:18:34.778366 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:18:34 crc kubenswrapper[4776]: E0128 08:18:34.778605 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:18:39 crc kubenswrapper[4776]: I0128 08:18:39.302023 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:39 crc kubenswrapper[4776]: I0128 08:18:39.302698 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:39 crc kubenswrapper[4776]: I0128 08:18:39.748305 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:39 crc kubenswrapper[4776]: I0128 08:18:39.907870 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.108659 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.109122 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d2pqb" 
podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="registry-server" containerID="cri-o://fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d" gracePeriod=2 Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.599743 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.744277 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content\") pod \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.744368 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvkq\" (UniqueName: \"kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq\") pod \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.744578 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities\") pod \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\" (UID: \"6ada7d8d-f10a-4f78-8222-5116f3487f9e\") " Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.746163 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities" (OuterVolumeSpecName: "utilities") pod "6ada7d8d-f10a-4f78-8222-5116f3487f9e" (UID: "6ada7d8d-f10a-4f78-8222-5116f3487f9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.753991 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq" (OuterVolumeSpecName: "kube-api-access-9fvkq") pod "6ada7d8d-f10a-4f78-8222-5116f3487f9e" (UID: "6ada7d8d-f10a-4f78-8222-5116f3487f9e"). InnerVolumeSpecName "kube-api-access-9fvkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.805859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ada7d8d-f10a-4f78-8222-5116f3487f9e" (UID: "6ada7d8d-f10a-4f78-8222-5116f3487f9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.846441 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvkq\" (UniqueName: \"kubernetes.io/projected/6ada7d8d-f10a-4f78-8222-5116f3487f9e-kube-api-access-9fvkq\") on node \"crc\" DevicePath \"\"" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.846670 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.846737 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ada7d8d-f10a-4f78-8222-5116f3487f9e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.878502 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d2pqb" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.878799 4776 generic.go:334] "Generic (PLEG): container finished" podID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerID="fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d" exitCode=0 Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.878811 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerDied","Data":"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d"} Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.878876 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2pqb" event={"ID":"6ada7d8d-f10a-4f78-8222-5116f3487f9e","Type":"ContainerDied","Data":"c9463c2ea0c72af92899a090fa3ba1877c70aac61b8f90cb3df38236fa093867"} Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.878907 4776 scope.go:117] "RemoveContainer" containerID="fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.914696 4776 scope.go:117] "RemoveContainer" containerID="39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.918366 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.932044 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d2pqb"] Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.947522 4776 scope.go:117] "RemoveContainer" containerID="22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.989389 4776 scope.go:117] "RemoveContainer" 
containerID="fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d" Jan 28 08:18:42 crc kubenswrapper[4776]: E0128 08:18:42.989927 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d\": container with ID starting with fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d not found: ID does not exist" containerID="fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.990005 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d"} err="failed to get container status \"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d\": rpc error: code = NotFound desc = could not find container \"fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d\": container with ID starting with fb1a7031b793a77ce382029c5dacc63549fb219becffd15f2fc361c15dd8c60d not found: ID does not exist" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.990060 4776 scope.go:117] "RemoveContainer" containerID="39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033" Jan 28 08:18:42 crc kubenswrapper[4776]: E0128 08:18:42.990487 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033\": container with ID starting with 39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033 not found: ID does not exist" containerID="39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.990641 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033"} err="failed to get container status \"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033\": rpc error: code = NotFound desc = could not find container \"39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033\": container with ID starting with 39f605025d6e90af963adccdbc5d87e57fbf85f202a83003234c0dc208133033 not found: ID does not exist" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.990688 4776 scope.go:117] "RemoveContainer" containerID="22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f" Jan 28 08:18:42 crc kubenswrapper[4776]: E0128 08:18:42.991608 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f\": container with ID starting with 22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f not found: ID does not exist" containerID="22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f" Jan 28 08:18:42 crc kubenswrapper[4776]: I0128 08:18:42.991670 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f"} err="failed to get container status \"22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f\": rpc error: code = NotFound desc = could not find container \"22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f\": container with ID starting with 22531389977b5f81b7256d42163cbe0e2c7b39878fa1203b2260b764b244ec9f not found: ID does not exist" Jan 28 08:18:43 crc kubenswrapper[4776]: I0128 08:18:43.324149 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" path="/var/lib/kubelet/pods/6ada7d8d-f10a-4f78-8222-5116f3487f9e/volumes" Jan 28 08:18:47 crc kubenswrapper[4776]: I0128 
08:18:47.305024 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:18:47 crc kubenswrapper[4776]: E0128 08:18:47.305778 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:18:50 crc kubenswrapper[4776]: I0128 08:18:50.411620 4776 scope.go:117] "RemoveContainer" containerID="99bcd5f5bac1d7d7140008d96d8255040ea0902bb12c4e7433499d7a1e4a6c2b" Jan 28 08:19:00 crc kubenswrapper[4776]: I0128 08:19:00.305184 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:19:00 crc kubenswrapper[4776]: E0128 08:19:00.306455 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:19:11 crc kubenswrapper[4776]: I0128 08:19:11.305047 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:19:11 crc kubenswrapper[4776]: E0128 08:19:11.305979 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:19:23 crc kubenswrapper[4776]: I0128 08:19:23.304927 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:19:23 crc kubenswrapper[4776]: E0128 08:19:23.305958 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.079406 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:25 crc kubenswrapper[4776]: E0128 08:19:25.080717 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="extract-content" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.080782 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="extract-content" Jan 28 08:19:25 crc kubenswrapper[4776]: E0128 08:19:25.080842 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="registry-server" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.080856 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="registry-server" Jan 28 08:19:25 crc kubenswrapper[4776]: E0128 08:19:25.080900 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="extract-utilities" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.080914 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="extract-utilities" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.081274 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ada7d8d-f10a-4f78-8222-5116f3487f9e" containerName="registry-server" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.083743 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.088194 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.122428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.122560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.122582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm52b\" (UniqueName: \"kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b\") pod \"redhat-operators-s4w5t\" (UID: 
\"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.224947 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm52b\" (UniqueName: \"kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.225095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.225229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.225813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.225873 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " 
pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.249056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm52b\" (UniqueName: \"kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b\") pod \"redhat-operators-s4w5t\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.424222 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:25 crc kubenswrapper[4776]: I0128 08:19:25.881201 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:26 crc kubenswrapper[4776]: I0128 08:19:26.394469 4776 generic.go:334] "Generic (PLEG): container finished" podID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerID="e000c40e00cdb0721663c9f10a34e29b92c30e218c7806b93f9c2b3cf4891b05" exitCode=0 Jan 28 08:19:26 crc kubenswrapper[4776]: I0128 08:19:26.394588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerDied","Data":"e000c40e00cdb0721663c9f10a34e29b92c30e218c7806b93f9c2b3cf4891b05"} Jan 28 08:19:26 crc kubenswrapper[4776]: I0128 08:19:26.394895 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerStarted","Data":"2943451a54356cfeeed6cf76397f629b89abfd65b1aa3f89fef3eb72a162c17b"} Jan 28 08:19:27 crc kubenswrapper[4776]: I0128 08:19:27.404451 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" 
event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerStarted","Data":"8be4a4f89d8d906172c270ce888b7b86bec3e33e61a318c3e711697159929fbb"} Jan 28 08:19:30 crc kubenswrapper[4776]: I0128 08:19:30.433714 4776 generic.go:334] "Generic (PLEG): container finished" podID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerID="8be4a4f89d8d906172c270ce888b7b86bec3e33e61a318c3e711697159929fbb" exitCode=0 Jan 28 08:19:30 crc kubenswrapper[4776]: I0128 08:19:30.433795 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerDied","Data":"8be4a4f89d8d906172c270ce888b7b86bec3e33e61a318c3e711697159929fbb"} Jan 28 08:19:31 crc kubenswrapper[4776]: I0128 08:19:31.448751 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerStarted","Data":"986200e1c4be78954339b7b73a155b12c3e63611361dbdef36dda8c571a75c89"} Jan 28 08:19:31 crc kubenswrapper[4776]: I0128 08:19:31.476290 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s4w5t" podStartSLOduration=1.945239079 podStartE2EDuration="6.476273669s" podCreationTimestamp="2026-01-28 08:19:25 +0000 UTC" firstStartedPulling="2026-01-28 08:19:26.397442317 +0000 UTC m=+5337.813102477" lastFinishedPulling="2026-01-28 08:19:30.928476897 +0000 UTC m=+5342.344137067" observedRunningTime="2026-01-28 08:19:31.475191719 +0000 UTC m=+5342.890851899" watchObservedRunningTime="2026-01-28 08:19:31.476273669 +0000 UTC m=+5342.891933829" Jan 28 08:19:35 crc kubenswrapper[4776]: I0128 08:19:35.304677 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:19:35 crc kubenswrapper[4776]: E0128 08:19:35.305963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:19:35 crc kubenswrapper[4776]: I0128 08:19:35.425297 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:35 crc kubenswrapper[4776]: I0128 08:19:35.425340 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:36 crc kubenswrapper[4776]: I0128 08:19:36.500589 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s4w5t" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="registry-server" probeResult="failure" output=< Jan 28 08:19:36 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Jan 28 08:19:36 crc kubenswrapper[4776]: > Jan 28 08:19:45 crc kubenswrapper[4776]: I0128 08:19:45.490642 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:45 crc kubenswrapper[4776]: I0128 08:19:45.566841 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:45 crc kubenswrapper[4776]: I0128 08:19:45.742487 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:46 crc kubenswrapper[4776]: I0128 08:19:46.618052 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s4w5t" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="registry-server" 
containerID="cri-o://986200e1c4be78954339b7b73a155b12c3e63611361dbdef36dda8c571a75c89" gracePeriod=2 Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.632427 4776 generic.go:334] "Generic (PLEG): container finished" podID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerID="986200e1c4be78954339b7b73a155b12c3e63611361dbdef36dda8c571a75c89" exitCode=0 Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.632814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerDied","Data":"986200e1c4be78954339b7b73a155b12c3e63611361dbdef36dda8c571a75c89"} Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.792106 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.918983 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content\") pod \"50e28ba4-3e71-41e2-8b83-11a5756227f1\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.919200 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm52b\" (UniqueName: \"kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b\") pod \"50e28ba4-3e71-41e2-8b83-11a5756227f1\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.919332 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities\") pod \"50e28ba4-3e71-41e2-8b83-11a5756227f1\" (UID: \"50e28ba4-3e71-41e2-8b83-11a5756227f1\") " Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 
08:19:47.920290 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities" (OuterVolumeSpecName: "utilities") pod "50e28ba4-3e71-41e2-8b83-11a5756227f1" (UID: "50e28ba4-3e71-41e2-8b83-11a5756227f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:19:47 crc kubenswrapper[4776]: I0128 08:19:47.927970 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b" (OuterVolumeSpecName: "kube-api-access-tm52b") pod "50e28ba4-3e71-41e2-8b83-11a5756227f1" (UID: "50e28ba4-3e71-41e2-8b83-11a5756227f1"). InnerVolumeSpecName "kube-api-access-tm52b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.022286 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.022342 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm52b\" (UniqueName: \"kubernetes.io/projected/50e28ba4-3e71-41e2-8b83-11a5756227f1-kube-api-access-tm52b\") on node \"crc\" DevicePath \"\"" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.046115 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50e28ba4-3e71-41e2-8b83-11a5756227f1" (UID: "50e28ba4-3e71-41e2-8b83-11a5756227f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.124056 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e28ba4-3e71-41e2-8b83-11a5756227f1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.306426 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:19:48 crc kubenswrapper[4776]: E0128 08:19:48.306876 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.651310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4w5t" event={"ID":"50e28ba4-3e71-41e2-8b83-11a5756227f1","Type":"ContainerDied","Data":"2943451a54356cfeeed6cf76397f629b89abfd65b1aa3f89fef3eb72a162c17b"} Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.651395 4776 scope.go:117] "RemoveContainer" containerID="986200e1c4be78954339b7b73a155b12c3e63611361dbdef36dda8c571a75c89" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.651456 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s4w5t" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.707904 4776 scope.go:117] "RemoveContainer" containerID="8be4a4f89d8d906172c270ce888b7b86bec3e33e61a318c3e711697159929fbb" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.729072 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.747599 4776 scope.go:117] "RemoveContainer" containerID="e000c40e00cdb0721663c9f10a34e29b92c30e218c7806b93f9c2b3cf4891b05" Jan 28 08:19:48 crc kubenswrapper[4776]: I0128 08:19:48.749954 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s4w5t"] Jan 28 08:19:49 crc kubenswrapper[4776]: I0128 08:19:49.320809 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" path="/var/lib/kubelet/pods/50e28ba4-3e71-41e2-8b83-11a5756227f1/volumes" Jan 28 08:19:50 crc kubenswrapper[4776]: I0128 08:19:50.511949 4776 scope.go:117] "RemoveContainer" containerID="c95af90b791a91ab1ac8cdf9d1c10564327e9481cd1bea29bae93e307137ca0b" Jan 28 08:20:02 crc kubenswrapper[4776]: I0128 08:20:02.305264 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:20:02 crc kubenswrapper[4776]: E0128 08:20:02.306228 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:20:17 crc kubenswrapper[4776]: I0128 08:20:17.307292 4776 scope.go:117] "RemoveContainer" 
containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:20:17 crc kubenswrapper[4776]: E0128 08:20:17.309349 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:20:31 crc kubenswrapper[4776]: I0128 08:20:31.305751 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:20:31 crc kubenswrapper[4776]: E0128 08:20:31.307007 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:20:46 crc kubenswrapper[4776]: I0128 08:20:46.305327 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:20:46 crc kubenswrapper[4776]: E0128 08:20:46.306094 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.438765 4776 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:20:58 crc kubenswrapper[4776]: E0128 08:20:58.440403 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="registry-server" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.440429 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="registry-server" Jan 28 08:20:58 crc kubenswrapper[4776]: E0128 08:20:58.440468 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="extract-utilities" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.440481 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="extract-utilities" Jan 28 08:20:58 crc kubenswrapper[4776]: E0128 08:20:58.440578 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="extract-content" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.440591 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="extract-content" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.440965 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e28ba4-3e71-41e2-8b83-11a5756227f1" containerName="registry-server" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.443579 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.457982 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.499876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.499971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.500055 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tf74\" (UniqueName: \"kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.601678 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.601764 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2tf74\" (UniqueName: \"kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.601949 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.602385 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.602449 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.631176 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf74\" (UniqueName: \"kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74\") pod \"certified-operators-w5748\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:58 crc kubenswrapper[4776]: I0128 08:20:58.808067 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:20:59 crc kubenswrapper[4776]: I0128 08:20:59.325291 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:20:59 crc kubenswrapper[4776]: I0128 08:20:59.528209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerStarted","Data":"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f"} Jan 28 08:20:59 crc kubenswrapper[4776]: I0128 08:20:59.528249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerStarted","Data":"8c74856ce04a0e160ed78c4474a652d11b521a3079b8978974880d8adae4bb52"} Jan 28 08:21:00 crc kubenswrapper[4776]: I0128 08:21:00.304913 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:21:00 crc kubenswrapper[4776]: E0128 08:21:00.305875 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:21:00 crc kubenswrapper[4776]: I0128 08:21:00.542824 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerID="5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f" exitCode=0 Jan 28 08:21:00 crc kubenswrapper[4776]: I0128 08:21:00.542895 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" 
event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerDied","Data":"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f"} Jan 28 08:21:00 crc kubenswrapper[4776]: I0128 08:21:00.548062 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 08:21:02 crc kubenswrapper[4776]: I0128 08:21:02.567201 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerID="16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef" exitCode=0 Jan 28 08:21:02 crc kubenswrapper[4776]: I0128 08:21:02.567274 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerDied","Data":"16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef"} Jan 28 08:21:03 crc kubenswrapper[4776]: I0128 08:21:03.578611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerStarted","Data":"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e"} Jan 28 08:21:03 crc kubenswrapper[4776]: I0128 08:21:03.598991 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5748" podStartSLOduration=3.156131911 podStartE2EDuration="5.598969638s" podCreationTimestamp="2026-01-28 08:20:58 +0000 UTC" firstStartedPulling="2026-01-28 08:21:00.547827185 +0000 UTC m=+5431.963487345" lastFinishedPulling="2026-01-28 08:21:02.990664912 +0000 UTC m=+5434.406325072" observedRunningTime="2026-01-28 08:21:03.597422307 +0000 UTC m=+5435.013082467" watchObservedRunningTime="2026-01-28 08:21:03.598969638 +0000 UTC m=+5435.014629818" Jan 28 08:21:08 crc kubenswrapper[4776]: I0128 08:21:08.808657 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:08 crc kubenswrapper[4776]: I0128 08:21:08.809330 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:08 crc kubenswrapper[4776]: I0128 08:21:08.898064 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:09 crc kubenswrapper[4776]: I0128 08:21:09.725282 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:09 crc kubenswrapper[4776]: I0128 08:21:09.794280 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:21:11 crc kubenswrapper[4776]: I0128 08:21:11.695753 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5748" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="registry-server" containerID="cri-o://48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e" gracePeriod=2 Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.227335 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.305058 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:21:12 crc kubenswrapper[4776]: E0128 08:21:12.305406 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.357505 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tf74\" (UniqueName: \"kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74\") pod \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.357941 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content\") pod \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.358084 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities\") pod \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\" (UID: \"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f\") " Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.359539 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities" (OuterVolumeSpecName: "utilities") pod "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" (UID: "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.374005 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74" (OuterVolumeSpecName: "kube-api-access-2tf74") pod "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" (UID: "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f"). InnerVolumeSpecName "kube-api-access-2tf74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.416846 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" (UID: "253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.463986 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tf74\" (UniqueName: \"kubernetes.io/projected/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-kube-api-access-2tf74\") on node \"crc\" DevicePath \"\"" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.464012 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.464025 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.715365 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerID="48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e" exitCode=0 Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.715434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerDied","Data":"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e"} Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.715470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5748" event={"ID":"253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f","Type":"ContainerDied","Data":"8c74856ce04a0e160ed78c4474a652d11b521a3079b8978974880d8adae4bb52"} Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.715495 4776 scope.go:117] "RemoveContainer" containerID="48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 
08:21:12.716884 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5748" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.744640 4776 scope.go:117] "RemoveContainer" containerID="16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.778608 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.789485 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5748"] Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.795000 4776 scope.go:117] "RemoveContainer" containerID="5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.832153 4776 scope.go:117] "RemoveContainer" containerID="48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e" Jan 28 08:21:12 crc kubenswrapper[4776]: E0128 08:21:12.832818 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e\": container with ID starting with 48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e not found: ID does not exist" containerID="48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.832865 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e"} err="failed to get container status \"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e\": rpc error: code = NotFound desc = could not find container \"48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e\": container with ID starting with 
48bca458457c78732f2733ef245f06c5a4426c3822146239d704b0ccd5ae809e not found: ID does not exist" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.832899 4776 scope.go:117] "RemoveContainer" containerID="16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef" Jan 28 08:21:12 crc kubenswrapper[4776]: E0128 08:21:12.833565 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef\": container with ID starting with 16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef not found: ID does not exist" containerID="16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.833618 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef"} err="failed to get container status \"16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef\": rpc error: code = NotFound desc = could not find container \"16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef\": container with ID starting with 16c1590d1cfb9d38504530ce10b26fd0a4a4e06c438d66773037546e400335ef not found: ID does not exist" Jan 28 08:21:12 crc kubenswrapper[4776]: I0128 08:21:12.833656 4776 scope.go:117] "RemoveContainer" containerID="5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f" Jan 28 08:21:12 crc kubenswrapper[4776]: E0128 08:21:12.834244 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f\": container with ID starting with 5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f not found: ID does not exist" containerID="5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f" Jan 28 08:21:12 crc 
kubenswrapper[4776]: I0128 08:21:12.834297 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f"} err="failed to get container status \"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f\": rpc error: code = NotFound desc = could not find container \"5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f\": container with ID starting with 5e8b8fb481b5969286e9fab96c01c6204d17090c912cc716c7f930e2ec9dd19f not found: ID does not exist" Jan 28 08:21:13 crc kubenswrapper[4776]: I0128 08:21:13.319578 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" path="/var/lib/kubelet/pods/253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f/volumes" Jan 28 08:21:23 crc kubenswrapper[4776]: I0128 08:21:23.308916 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:21:23 crc kubenswrapper[4776]: E0128 08:21:23.310152 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.548364 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zk778/must-gather-v5bhp"] Jan 28 08:21:25 crc kubenswrapper[4776]: E0128 08:21:25.554558 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="extract-content" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.554581 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="extract-content" Jan 28 08:21:25 crc kubenswrapper[4776]: E0128 08:21:25.554600 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="registry-server" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.554608 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="registry-server" Jan 28 08:21:25 crc kubenswrapper[4776]: E0128 08:21:25.554626 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="extract-utilities" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.554635 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="extract-utilities" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.554884 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="253a13e1-de2a-4f8f-bb2f-f4cbd391fa8f" containerName="registry-server" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.556172 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.562158 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zk778"/"openshift-service-ca.crt" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.562535 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zk778"/"kube-root-ca.crt" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.572084 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zk778/must-gather-v5bhp"] Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.656258 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4n4t\" (UniqueName: \"kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.656673 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.758105 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.758198 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r4n4t\" (UniqueName: \"kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.758857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.782393 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4n4t\" (UniqueName: \"kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t\") pod \"must-gather-v5bhp\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:25 crc kubenswrapper[4776]: I0128 08:21:25.874480 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:21:26 crc kubenswrapper[4776]: I0128 08:21:26.426190 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zk778/must-gather-v5bhp"] Jan 28 08:21:26 crc kubenswrapper[4776]: I0128 08:21:26.874796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/must-gather-v5bhp" event={"ID":"6b68d949-d849-4a55-a73a-295dac526f50","Type":"ContainerStarted","Data":"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d"} Jan 28 08:21:26 crc kubenswrapper[4776]: I0128 08:21:26.875158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/must-gather-v5bhp" event={"ID":"6b68d949-d849-4a55-a73a-295dac526f50","Type":"ContainerStarted","Data":"64a4a7a88003e145f001d31194e0e41358aaa30f499add7ed07bd5332f45c277"} Jan 28 08:21:27 crc kubenswrapper[4776]: I0128 08:21:27.894900 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/must-gather-v5bhp" event={"ID":"6b68d949-d849-4a55-a73a-295dac526f50","Type":"ContainerStarted","Data":"cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30"} Jan 28 08:21:27 crc kubenswrapper[4776]: I0128 08:21:27.923586 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zk778/must-gather-v5bhp" podStartSLOduration=2.923566604 podStartE2EDuration="2.923566604s" podCreationTimestamp="2026-01-28 08:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 08:21:27.921120538 +0000 UTC m=+5459.336780738" watchObservedRunningTime="2026-01-28 08:21:27.923566604 +0000 UTC m=+5459.339226774" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.527329 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zk778/crc-debug-9w979"] Jan 28 08:21:30 crc kubenswrapper[4776]: 
I0128 08:21:30.529431 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.532361 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zk778"/"default-dockercfg-h2qnc" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.566866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn95f\" (UniqueName: \"kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.567056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.668678 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn95f\" (UniqueName: \"kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.668766 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.668887 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.692168 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn95f\" (UniqueName: \"kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f\") pod \"crc-debug-9w979\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.853457 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:21:30 crc kubenswrapper[4776]: W0128 08:21:30.902121 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e921a43_d690_4656_b7fd_aee654ab62e3.slice/crio-f229b37be1e4640e88008e30f92ffd39ee2ab62406dfa62c56a3caf1c656e594 WatchSource:0}: Error finding container f229b37be1e4640e88008e30f92ffd39ee2ab62406dfa62c56a3caf1c656e594: Status 404 returned error can't find the container with id f229b37be1e4640e88008e30f92ffd39ee2ab62406dfa62c56a3caf1c656e594 Jan 28 08:21:30 crc kubenswrapper[4776]: I0128 08:21:30.930771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-9w979" event={"ID":"4e921a43-d690-4656-b7fd-aee654ab62e3","Type":"ContainerStarted","Data":"f229b37be1e4640e88008e30f92ffd39ee2ab62406dfa62c56a3caf1c656e594"} Jan 28 08:21:31 crc kubenswrapper[4776]: I0128 08:21:31.941904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-9w979" event={"ID":"4e921a43-d690-4656-b7fd-aee654ab62e3","Type":"ContainerStarted","Data":"d1d587bf42268b82ecc20b28e5cd9fe77089d1789e80df73f55a2b5c88a9f1b0"} Jan 
28 08:21:31 crc kubenswrapper[4776]: I0128 08:21:31.960337 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zk778/crc-debug-9w979" podStartSLOduration=1.960311468 podStartE2EDuration="1.960311468s" podCreationTimestamp="2026-01-28 08:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 08:21:31.955807266 +0000 UTC m=+5463.371467426" watchObservedRunningTime="2026-01-28 08:21:31.960311468 +0000 UTC m=+5463.375971668" Jan 28 08:21:37 crc kubenswrapper[4776]: I0128 08:21:37.307697 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:21:37 crc kubenswrapper[4776]: E0128 08:21:37.308592 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:21:51 crc kubenswrapper[4776]: I0128 08:21:51.304596 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:21:51 crc kubenswrapper[4776]: E0128 08:21:51.305425 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:22:03 crc kubenswrapper[4776]: I0128 08:22:03.304313 4776 scope.go:117] 
"RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:22:03 crc kubenswrapper[4776]: E0128 08:22:03.305237 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:22:11 crc kubenswrapper[4776]: I0128 08:22:11.334776 4776 generic.go:334] "Generic (PLEG): container finished" podID="4e921a43-d690-4656-b7fd-aee654ab62e3" containerID="d1d587bf42268b82ecc20b28e5cd9fe77089d1789e80df73f55a2b5c88a9f1b0" exitCode=0 Jan 28 08:22:11 crc kubenswrapper[4776]: I0128 08:22:11.334872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-9w979" event={"ID":"4e921a43-d690-4656-b7fd-aee654ab62e3","Type":"ContainerDied","Data":"d1d587bf42268b82ecc20b28e5cd9fe77089d1789e80df73f55a2b5c88a9f1b0"} Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.464596 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.505425 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zk778/crc-debug-9w979"] Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.515570 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zk778/crc-debug-9w979"] Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.606146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host\") pod \"4e921a43-d690-4656-b7fd-aee654ab62e3\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.606228 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host" (OuterVolumeSpecName: "host") pod "4e921a43-d690-4656-b7fd-aee654ab62e3" (UID: "4e921a43-d690-4656-b7fd-aee654ab62e3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.606327 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn95f\" (UniqueName: \"kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f\") pod \"4e921a43-d690-4656-b7fd-aee654ab62e3\" (UID: \"4e921a43-d690-4656-b7fd-aee654ab62e3\") " Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.606730 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e921a43-d690-4656-b7fd-aee654ab62e3-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.614719 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f" (OuterVolumeSpecName: "kube-api-access-nn95f") pod "4e921a43-d690-4656-b7fd-aee654ab62e3" (UID: "4e921a43-d690-4656-b7fd-aee654ab62e3"). InnerVolumeSpecName "kube-api-access-nn95f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:22:12 crc kubenswrapper[4776]: I0128 08:22:12.708607 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn95f\" (UniqueName: \"kubernetes.io/projected/4e921a43-d690-4656-b7fd-aee654ab62e3-kube-api-access-nn95f\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.318199 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e921a43-d690-4656-b7fd-aee654ab62e3" path="/var/lib/kubelet/pods/4e921a43-d690-4656-b7fd-aee654ab62e3/volumes" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.365402 4776 scope.go:117] "RemoveContainer" containerID="d1d587bf42268b82ecc20b28e5cd9fe77089d1789e80df73f55a2b5c88a9f1b0" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.365455 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-9w979" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.683945 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zk778/crc-debug-pwmsq"] Jan 28 08:22:13 crc kubenswrapper[4776]: E0128 08:22:13.684481 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e921a43-d690-4656-b7fd-aee654ab62e3" containerName="container-00" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.684497 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e921a43-d690-4656-b7fd-aee654ab62e3" containerName="container-00" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.684957 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e921a43-d690-4656-b7fd-aee654ab62e3" containerName="container-00" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.685785 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.688147 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zk778"/"default-dockercfg-h2qnc" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.830820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plpsw\" (UniqueName: \"kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.831805 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " 
pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.933503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plpsw\" (UniqueName: \"kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.933654 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.933754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:13 crc kubenswrapper[4776]: I0128 08:22:13.951292 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plpsw\" (UniqueName: \"kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw\") pod \"crc-debug-pwmsq\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:14 crc kubenswrapper[4776]: I0128 08:22:14.003822 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:14 crc kubenswrapper[4776]: I0128 08:22:14.377375 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-pwmsq" event={"ID":"5be936a4-8645-47dd-bf0f-5f90d009f232","Type":"ContainerStarted","Data":"f8287c20ca22b72117794ac39a69e119bfd821c2805c22ee1939ad06b19f5901"} Jan 28 08:22:14 crc kubenswrapper[4776]: I0128 08:22:14.377951 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-pwmsq" event={"ID":"5be936a4-8645-47dd-bf0f-5f90d009f232","Type":"ContainerStarted","Data":"06d4a4d7a82214fc200ced6b33c26e39b357613fc0a74159b0a50e2252310384"} Jan 28 08:22:14 crc kubenswrapper[4776]: I0128 08:22:14.396789 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zk778/crc-debug-pwmsq" podStartSLOduration=1.396770601 podStartE2EDuration="1.396770601s" podCreationTimestamp="2026-01-28 08:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 08:22:14.393804201 +0000 UTC m=+5505.809464371" watchObservedRunningTime="2026-01-28 08:22:14.396770601 +0000 UTC m=+5505.812430761" Jan 28 08:22:15 crc kubenswrapper[4776]: I0128 08:22:15.390572 4776 generic.go:334] "Generic (PLEG): container finished" podID="5be936a4-8645-47dd-bf0f-5f90d009f232" containerID="f8287c20ca22b72117794ac39a69e119bfd821c2805c22ee1939ad06b19f5901" exitCode=0 Jan 28 08:22:15 crc kubenswrapper[4776]: I0128 08:22:15.390904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-pwmsq" event={"ID":"5be936a4-8645-47dd-bf0f-5f90d009f232","Type":"ContainerDied","Data":"f8287c20ca22b72117794ac39a69e119bfd821c2805c22ee1939ad06b19f5901"} Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.513196 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.672571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plpsw\" (UniqueName: \"kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw\") pod \"5be936a4-8645-47dd-bf0f-5f90d009f232\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.673130 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host\") pod \"5be936a4-8645-47dd-bf0f-5f90d009f232\" (UID: \"5be936a4-8645-47dd-bf0f-5f90d009f232\") " Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.673589 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host" (OuterVolumeSpecName: "host") pod "5be936a4-8645-47dd-bf0f-5f90d009f232" (UID: "5be936a4-8645-47dd-bf0f-5f90d009f232"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.674000 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5be936a4-8645-47dd-bf0f-5f90d009f232-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.678348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw" (OuterVolumeSpecName: "kube-api-access-plpsw") pod "5be936a4-8645-47dd-bf0f-5f90d009f232" (UID: "5be936a4-8645-47dd-bf0f-5f90d009f232"). InnerVolumeSpecName "kube-api-access-plpsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.729379 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zk778/crc-debug-pwmsq"] Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.740481 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zk778/crc-debug-pwmsq"] Jan 28 08:22:16 crc kubenswrapper[4776]: I0128 08:22:16.775629 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plpsw\" (UniqueName: \"kubernetes.io/projected/5be936a4-8645-47dd-bf0f-5f90d009f232-kube-api-access-plpsw\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:17 crc kubenswrapper[4776]: I0128 08:22:17.304714 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:22:17 crc kubenswrapper[4776]: E0128 08:22:17.305301 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:22:17 crc kubenswrapper[4776]: I0128 08:22:17.316169 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be936a4-8645-47dd-bf0f-5f90d009f232" path="/var/lib/kubelet/pods/5be936a4-8645-47dd-bf0f-5f90d009f232/volumes" Jan 28 08:22:17 crc kubenswrapper[4776]: I0128 08:22:17.406453 4776 scope.go:117] "RemoveContainer" containerID="f8287c20ca22b72117794ac39a69e119bfd821c2805c22ee1939ad06b19f5901" Jan 28 08:22:17 crc kubenswrapper[4776]: I0128 08:22:17.406499 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-pwmsq" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.015143 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zk778/crc-debug-x9vfp"] Jan 28 08:22:18 crc kubenswrapper[4776]: E0128 08:22:18.015652 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be936a4-8645-47dd-bf0f-5f90d009f232" containerName="container-00" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.015671 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be936a4-8645-47dd-bf0f-5f90d009f232" containerName="container-00" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.015977 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be936a4-8645-47dd-bf0f-5f90d009f232" containerName="container-00" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.017343 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.021935 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zk778"/"default-dockercfg-h2qnc" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.102643 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.102709 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxkx\" (UniqueName: \"kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " 
pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.204270 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.204343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxkx\" (UniqueName: \"kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.204424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.228238 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxkx\" (UniqueName: \"kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx\") pod \"crc-debug-x9vfp\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.342009 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:18 crc kubenswrapper[4776]: W0128 08:22:18.367437 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dc8949b_e5aa_465c_aad4_ae4931fb6fba.slice/crio-31a4f3adebbad2fe34d6686fa9391146bb6d843622acba450cd361d52e637970 WatchSource:0}: Error finding container 31a4f3adebbad2fe34d6686fa9391146bb6d843622acba450cd361d52e637970: Status 404 returned error can't find the container with id 31a4f3adebbad2fe34d6686fa9391146bb6d843622acba450cd361d52e637970 Jan 28 08:22:18 crc kubenswrapper[4776]: I0128 08:22:18.420424 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-x9vfp" event={"ID":"6dc8949b-e5aa-465c-aad4-ae4931fb6fba","Type":"ContainerStarted","Data":"31a4f3adebbad2fe34d6686fa9391146bb6d843622acba450cd361d52e637970"} Jan 28 08:22:19 crc kubenswrapper[4776]: I0128 08:22:19.438234 4776 generic.go:334] "Generic (PLEG): container finished" podID="6dc8949b-e5aa-465c-aad4-ae4931fb6fba" containerID="030083b2ebb7cfff50f1cbfae19bd1637bde33ab3fbbf954e7a44bf5b11e9fd4" exitCode=0 Jan 28 08:22:19 crc kubenswrapper[4776]: I0128 08:22:19.438298 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/crc-debug-x9vfp" event={"ID":"6dc8949b-e5aa-465c-aad4-ae4931fb6fba","Type":"ContainerDied","Data":"030083b2ebb7cfff50f1cbfae19bd1637bde33ab3fbbf954e7a44bf5b11e9fd4"} Jan 28 08:22:19 crc kubenswrapper[4776]: I0128 08:22:19.489322 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zk778/crc-debug-x9vfp"] Jan 28 08:22:19 crc kubenswrapper[4776]: I0128 08:22:19.504689 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zk778/crc-debug-x9vfp"] Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.543128 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.656615 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host\") pod \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.656829 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsxkx\" (UniqueName: \"kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx\") pod \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\" (UID: \"6dc8949b-e5aa-465c-aad4-ae4931fb6fba\") " Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.656993 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host" (OuterVolumeSpecName: "host") pod "6dc8949b-e5aa-465c-aad4-ae4931fb6fba" (UID: "6dc8949b-e5aa-465c-aad4-ae4931fb6fba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.657413 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-host\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.666650 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx" (OuterVolumeSpecName: "kube-api-access-qsxkx") pod "6dc8949b-e5aa-465c-aad4-ae4931fb6fba" (UID: "6dc8949b-e5aa-465c-aad4-ae4931fb6fba"). InnerVolumeSpecName "kube-api-access-qsxkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:22:20 crc kubenswrapper[4776]: I0128 08:22:20.759724 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsxkx\" (UniqueName: \"kubernetes.io/projected/6dc8949b-e5aa-465c-aad4-ae4931fb6fba-kube-api-access-qsxkx\") on node \"crc\" DevicePath \"\"" Jan 28 08:22:21 crc kubenswrapper[4776]: I0128 08:22:21.323835 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc8949b-e5aa-465c-aad4-ae4931fb6fba" path="/var/lib/kubelet/pods/6dc8949b-e5aa-465c-aad4-ae4931fb6fba/volumes" Jan 28 08:22:21 crc kubenswrapper[4776]: I0128 08:22:21.460844 4776 scope.go:117] "RemoveContainer" containerID="030083b2ebb7cfff50f1cbfae19bd1637bde33ab3fbbf954e7a44bf5b11e9fd4" Jan 28 08:22:21 crc kubenswrapper[4776]: I0128 08:22:21.460880 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/crc-debug-x9vfp" Jan 28 08:22:29 crc kubenswrapper[4776]: I0128 08:22:29.313535 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:22:29 crc kubenswrapper[4776]: E0128 08:22:29.314324 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:22:40 crc kubenswrapper[4776]: I0128 08:22:40.305037 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:22:40 crc kubenswrapper[4776]: E0128 08:22:40.307380 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:22:52 crc kubenswrapper[4776]: I0128 08:22:52.305320 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:22:52 crc kubenswrapper[4776]: E0128 08:22:52.306173 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.230319 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55df755858-s7sbs_52586b79-6cf4-475f-852d-aa3c903b5b38/barbican-api/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.304720 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:23:06 crc kubenswrapper[4776]: E0128 08:23:06.305091 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.376101 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-55df755858-s7sbs_52586b79-6cf4-475f-852d-aa3c903b5b38/barbican-api-log/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.456687 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f446b9874-8rzlp_3092241e-a9e3-4c51-b31b-36eae29a52e1/barbican-keystone-listener/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.607164 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f446b9874-8rzlp_3092241e-a9e3-4c51-b31b-36eae29a52e1/barbican-keystone-listener-log/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.654122 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cc6b97df7-r6gfn_87270d72-c59e-4526-b69b-ceaebfb13fdd/barbican-worker/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.718749 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5cc6b97df7-r6gfn_87270d72-c59e-4526-b69b-ceaebfb13fdd/barbican-worker-log/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.837583 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-p28tb_9a3467e9-b4e8-40f9-8e96-3615aa7248ca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:06 crc kubenswrapper[4776]: I0128 08:23:06.932665 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/ceilometer-central-agent/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.023148 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/proxy-httpd/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.068740 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/ceilometer-notification-agent/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.096751 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_259f2bd2-3855-4ebb-8eeb-1457a26c74ae/sg-core/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.264885 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce856766-46b4-4498-9aa0-bdf8c0e946db/cinder-api/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.308420 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce856766-46b4-4498-9aa0-bdf8c0e946db/cinder-api-log/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.408930 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b39df015-b6fa-40eb-b270-21f01f3cb141/cinder-scheduler/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.558683 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b39df015-b6fa-40eb-b270-21f01f3cb141/probe/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.588571 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-w925h_aafcd74c-ce06-4b5f-a858-ed32676f7503/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.757706 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sv5tt_e525b968-aa0e-4d5a-9fe4-063ce4fdb686/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 08:23:07.813220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/init/0.log" Jan 28 08:23:07 crc kubenswrapper[4776]: I0128 
08:23:07.911874 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/init/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.015160 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-h2ctp_ea52630b-ebcc-41d5-9265-eec1e8ae437d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.081227 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f4d4c4b7-lt79s_d235d829-cf03-466a-a77d-27bf20dc03a0/dnsmasq-dns/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.189919 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_78f1d6b5-48e7-4ad3-8066-acb3faf83f73/glance-log/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.210399 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_78f1d6b5-48e7-4ad3-8066-acb3faf83f73/glance-httpd/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.377212 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_38e0faa6-840c-4e44-ad4a-16d42f83e194/glance-log/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.386129 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_38e0faa6-840c-4e44-ad4a-16d42f83e194/glance-httpd/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.540303 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fdfc784-krn2x_dc39478f-fee2-4eb1-89bc-789b5179a1ca/horizon/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.662441 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kqdqw_f05748ac-8e6e-4713-ae86-b0e4ffadec84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:08 crc kubenswrapper[4776]: I0128 08:23:08.883328 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-688gl_a00fc8a0-f777-496b-80d1-2c6e116d6e00/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:09 crc kubenswrapper[4776]: I0128 08:23:09.193457 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fdfc784-krn2x_dc39478f-fee2-4eb1-89bc-789b5179a1ca/horizon-log/0.log" Jan 28 08:23:09 crc kubenswrapper[4776]: I0128 08:23:09.368469 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29493121-7fxmq_bc1c1a39-5c9e-46c1-ab7c-856b4c1cda9d/keystone-cron/0.log" Jan 28 08:23:09 crc kubenswrapper[4776]: I0128 08:23:09.619987 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c15b0ff9-2ff0-4eed-821d-ba0da8122d6d/kube-state-metrics/0.log" Jan 28 08:23:09 crc kubenswrapper[4776]: I0128 08:23:09.764038 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-979f97b77-x2lng_9d7ed0f7-4d79-42b7-8f0d-805e6994e958/keystone-api/0.log" Jan 28 08:23:09 crc kubenswrapper[4776]: I0128 08:23:09.810474 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-288pn_5e450505-d924-4be0-8491-92297f012e24/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:10 crc kubenswrapper[4776]: I0128 08:23:10.244937 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5dttg_36edfabc-d31a-4c3f-98d0-3c830a282c65/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:10 crc kubenswrapper[4776]: I0128 08:23:10.255487 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_neutron-7967c58c5f-jkrnc_502a66df-cc30-46b3-98d4-d056d3497547/neutron-httpd/0.log" Jan 28 08:23:10 crc kubenswrapper[4776]: I0128 08:23:10.322013 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7967c58c5f-jkrnc_502a66df-cc30-46b3-98d4-d056d3497547/neutron-api/0.log" Jan 28 08:23:11 crc kubenswrapper[4776]: I0128 08:23:11.400560 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_091b6025-b6c8-4c2b-81d7-7b25aeaef620/nova-cell0-conductor-conductor/0.log" Jan 28 08:23:11 crc kubenswrapper[4776]: I0128 08:23:11.712225 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_10f77e4a-2026-45c0-80c8-a5d8b18046df/nova-cell1-conductor-conductor/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 08:23:12.005637 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ca740aa-b1f4-4878-93f4-116c2c17ff53/nova-api-log/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 08:23:12.049205 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5a1237de-1bcc-4b1b-bff5-a775162f3ed9/nova-cell1-novncproxy-novncproxy/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 08:23:12.208777 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gtkx6_5f807cd7-856d-4fd5-afe4-963a4a77a5bf/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 08:23:12.420419 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_31eb87d0-ab51-4738-8205-b515b8b57cf1/nova-metadata-log/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 08:23:12.672903 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ca740aa-b1f4-4878-93f4-116c2c17ff53/nova-api-api/0.log" Jan 28 08:23:12 crc kubenswrapper[4776]: I0128 
08:23:12.874769 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/mysql-bootstrap/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.003875 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7e68a0f9-2ffb-43a1-8945-37b6b68b2d43/nova-scheduler-scheduler/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.019101 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/mysql-bootstrap/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.074715 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b485d028-58ae-46ec-afd9-720d1a05bade/galera/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.239855 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/mysql-bootstrap/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.426580 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/mysql-bootstrap/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.437574 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e46126d4-c96e-4d66-9a2e-7f6873a6a1dd/galera/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.642027 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_abb12d44-fb9e-4ac4-95ad-a82606ff0709/openstackclient/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.678485 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lrzjl_4d2cb31b-ab97-4714-9978-225821819328/ovn-controller/0.log" Jan 28 08:23:13 crc kubenswrapper[4776]: I0128 08:23:13.831266 4776 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-metrics-59r62_a390a227-8301-4ed3-80ee-06131089f499/openstack-network-exporter/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.035938 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server-init/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.284759 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.297176 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovsdb-server-init/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.339965 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mbldl_7e98ecc8-0f85-413e-9b5a-4fe838eb9925/ovs-vswitchd/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.516368 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_31eb87d0-ab51-4738-8205-b515b8b57cf1/nova-metadata-metadata/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.591057 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kpxh8_34505caa-b76e-404f-b71a-a863e549d905/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.732101 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aff78447-a04f-4c5b-871f-3b47df7325c8/ovn-northd/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.735560 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aff78447-a04f-4c5b-871f-3b47df7325c8/openstack-network-exporter/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 
08:23:14.788856 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fd4193d3-abf1-457c-a774-de938b12b909/openstack-network-exporter/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.969670 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e356c55c-adea-433d-9f03-a403f330b085/openstack-network-exporter/0.log" Jan 28 08:23:14 crc kubenswrapper[4776]: I0128 08:23:14.995693 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fd4193d3-abf1-457c-a774-de938b12b909/ovsdbserver-nb/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.066116 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e356c55c-adea-433d-9f03-a403f330b085/ovsdbserver-sb/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.381815 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65b6d456f6-wvlnv_388a79ff-9e00-4cc1-a935-20a9b00402a8/placement-api/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.517496 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/init-config-reloader/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.524382 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65b6d456f6-wvlnv_388a79ff-9e00-4cc1-a935-20a9b00402a8/placement-log/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.643023 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/init-config-reloader/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.656476 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/config-reloader/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: 
I0128 08:23:15.700773 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/prometheus/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.836224 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e5a41440-d466-4d04-adb9-13760bb7977a/thanos-sidecar/0.log" Jan 28 08:23:15 crc kubenswrapper[4776]: I0128 08:23:15.849664 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/setup-container/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.073604 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/setup-container/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.085241 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34ca2bb8-f3ea-4cca-8def-c0f7feb37ac1/rabbitmq/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.181262 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/setup-container/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.467481 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-42xlm_e7a8bf84-fa6c-4637-bac6-cb9da6206f31/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.475445 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/setup-container/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.487350 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4cfd0885-0776-471c-b8f4-afb359e460b2/rabbitmq/0.log" Jan 28 08:23:16 crc 
kubenswrapper[4776]: I0128 08:23:16.738075 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m4zrt_de16818c-1081-4db9-a329-04c845b7ec51/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.841263 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-g4mf8_496f75bc-3d43-4af8-8bf4-c818f9b4db9d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:16 crc kubenswrapper[4776]: I0128 08:23:16.935197 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zcz2d_fa499a35-59bb-4ee1-93b4-98ab890c2126/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.116654 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vb66j_f2d64cd0-fb47-4169-88d3-dec3bb7591b0/ssh-known-hosts-edpm-deployment/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.313815 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c7dcc8f9-s7l5r_157a7da1-1327-40fc-83f3-30c1ef472c78/proxy-server/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.426890 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cmmlx_a83b6bd4-3813-465a-aa62-8bb029d2fcc0/swift-ring-rebalance/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.427459 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c7dcc8f9-s7l5r_157a7da1-1327-40fc-83f3-30c1ef472c78/proxy-httpd/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.617325 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-auditor/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 
08:23:17.679336 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-reaper/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.732968 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-replicator/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.747411 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/account-server/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.906389 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-auditor/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.942735 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-server/0.log" Jan 28 08:23:17 crc kubenswrapper[4776]: I0128 08:23:17.949719 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-replicator/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.032898 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/container-updater/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.141112 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-expirer/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.156606 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-auditor/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.252474 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-server/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.319100 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-replicator/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.330160 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/rsync/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.361601 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/object-updater/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.433364 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_331ac509-cce0-4545-ac41-1224aae65295/swift-recon-cron/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.605443 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fp67m_1dde4f71-00f6-46fa-b16c-429edb9ee1ce/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.688683 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0605b294-d429-4bfd-8924-39f8cb5cb105/tempest-tests-tempest-tests-runner/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.830358 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_40947156-8378-437f-935a-da00e0908508/test-operator-logs-container/0.log" Jan 28 08:23:18 crc kubenswrapper[4776]: I0128 08:23:18.917446 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tkr5b_cb26e5c4-e4de-4bca-86c0-160dffb2bb73/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 08:23:19 crc kubenswrapper[4776]: I0128 08:23:19.315089 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:23:19 crc kubenswrapper[4776]: E0128 08:23:19.315446 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:23:19 crc kubenswrapper[4776]: I0128 08:23:19.657100 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_e0bb7f08-c9fa-4595-9b5e-b80ff3821169/watcher-applier/0.log" Jan 28 08:23:20 crc kubenswrapper[4776]: I0128 08:23:20.007844 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8c06de04-7886-4696-8416-3559c16a5f7f/watcher-api-log/0.log" Jan 28 08:23:20 crc kubenswrapper[4776]: I0128 08:23:20.753330 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_c2cf5aeb-349c-47d3-989b-d56e91f7ff51/watcher-decision-engine/0.log" Jan 28 08:23:23 crc kubenswrapper[4776]: I0128 08:23:23.114058 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8c06de04-7886-4696-8416-3559c16a5f7f/watcher-api/0.log" Jan 28 08:23:24 crc kubenswrapper[4776]: I0128 08:23:24.682751 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f1377523-89dd-4311-886a-af2f7bb607b8/memcached/0.log" Jan 28 08:23:31 crc kubenswrapper[4776]: I0128 08:23:31.304683 4776 
scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:23:31 crc kubenswrapper[4776]: E0128 08:23:31.306267 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:23:42 crc kubenswrapper[4776]: I0128 08:23:42.305395 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:23:43 crc kubenswrapper[4776]: I0128 08:23:43.248744 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc"} Jan 28 08:23:48 crc kubenswrapper[4776]: I0128 08:23:48.702264 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-65ff799cfd-vzbmt_bd49109f-40b2-4db9-92d7-75aaf1093a21/manager/0.log" Jan 28 08:23:48 crc kubenswrapper[4776]: I0128 08:23:48.873098 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-655bf9cfbb-wjdst_79937ab5-c85f-4a4a-b35f-3b5d3711cbf0/manager/0.log" Jan 28 08:23:48 crc kubenswrapper[4776]: I0128 08:23:48.896992 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77554cdc5c-vpc6t_b5c3560a-18be-4f65-a9f7-0dddccb36193/manager/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.060674 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.220113 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.221447 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.227441 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.410632 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/util/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.422981 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/extract/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.435314 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f70a47a7c14c28a6dd596accf9fe8b041f88db6dd32769b415dddf7c95h9t2v_29726b5f-7cef-4a70-8004-88f628782852/pull/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.587810 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-575ffb885b-pxsb4_1a0ddddf-b0e4-4bdb-bf00-c978366213a0/manager/0.log" Jan 28 08:23:49 crc 
kubenswrapper[4776]: I0128 08:23:49.660438 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67dd55ff59-xw2v6_11a6de65-3758-4462-b2b0-9499232f8c29/manager/0.log" Jan 28 08:23:49 crc kubenswrapper[4776]: I0128 08:23:49.814213 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vwpnd_60dda427-fb0c-41c7-8ca8-9847554068f1/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.094345 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-768b776ffb-4mt8c_846af064-1eb1-4384-9b88-95770199bcdc/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.107807 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d75bc88d5-2fbq4_f9f1432a-2977-49f8-924a-5c82c86f1de0/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.271213 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55f684fd56-d2j6c_bf30e81e-a5a3-4af7-9a47-673f431d3666/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.312523 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-849fcfbb6b-pxjl4_39d3648e-5826-4e8a-b252-cb75e28651db/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.461725 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-lltvt_93dd9036-0e5e-4817-9a6c-eb89469de01b/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.637318 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7ffd8d76d4-vrlcf_38646136-0a67-43c4-90ee-d88ae407d654/manager/0.log" Jan 28 
08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.790057 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-wwhp5_6aa705d0-91c0-48eb-a5ed-ab6afb16b6f7/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.793630 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-94dd99d7d-gxmgb_a4407ded-de50-4ae5-bf84-2d6a3baa565c/manager/0.log" Jan 28 08:23:50 crc kubenswrapper[4776]: I0128 08:23:50.934501 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hbb56_9eae60fd-6135-4e41-bb77-e3caae71237d/manager/0.log" Jan 28 08:23:51 crc kubenswrapper[4776]: I0128 08:23:51.111220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5667c869b5-csrzg_bca7b855-4473-4cc2-aa88-38fd3de8fea8/operator/0.log" Jan 28 08:23:51 crc kubenswrapper[4776]: I0128 08:23:51.350193 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-njlm4_ce3e5ab9-01db-487b-9176-60b655f03b9b/registry-server/0.log" Jan 28 08:23:51 crc kubenswrapper[4776]: I0128 08:23:51.552999 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-4xx26_6a91b170-b0ed-4156-a9ee-74efca2560e7/manager/0.log" Jan 28 08:23:51 crc kubenswrapper[4776]: I0128 08:23:51.707651 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-6kqqd_9390b35e-9791-4ef4-ab66-12c4662f4cdf/manager/0.log" Jan 28 08:23:51 crc kubenswrapper[4776]: I0128 08:23:51.961426 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8wdwl_22f3f762-cc29-4a18-8bfc-430b85e041cc/operator/0.log" 
Jan 28 08:23:52 crc kubenswrapper[4776]: I0128 08:23:52.090669 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-8w6p2_5bc7efb1-0792-40f2-993a-eb865919048c/manager/0.log" Jan 28 08:23:52 crc kubenswrapper[4776]: I0128 08:23:52.292152 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fdbb46688-tnzx5_70aa7185-ded8-4807-822c-69fc5b03feeb/manager/0.log" Jan 28 08:23:52 crc kubenswrapper[4776]: I0128 08:23:52.528273 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-799bc87c89-jfcj6_6f563213-8471-44f5-83aa-820e73ed7746/manager/0.log" Jan 28 08:23:52 crc kubenswrapper[4776]: I0128 08:23:52.875133 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-8dl97_2f79777a-6f48-42d4-b39e-4393e932aea0/manager/0.log" Jan 28 08:23:52 crc kubenswrapper[4776]: I0128 08:23:52.950121 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-66fbd46fdf-dpq5g_3b6f6ae6-4641-4dd2-9021-197e9ea97b2b/manager/0.log" Jan 28 08:24:15 crc kubenswrapper[4776]: I0128 08:24:15.762037 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5bxjg_a8386b67-8be2-4d18-9358-fccd65c363db/control-plane-machine-set-operator/0.log" Jan 28 08:24:15 crc kubenswrapper[4776]: I0128 08:24:15.794335 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xm8h7_4fb7ccb2-9c11-4273-9888-f45aea05803d/kube-rbac-proxy/0.log" Jan 28 08:24:15 crc kubenswrapper[4776]: I0128 08:24:15.937646 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xm8h7_4fb7ccb2-9c11-4273-9888-f45aea05803d/machine-api-operator/0.log" Jan 28 08:24:29 crc kubenswrapper[4776]: I0128 08:24:29.931992 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kh746_ebf51615-2906-4bc1-9224-7bdc14f6afa6/cert-manager-controller/0.log" Jan 28 08:24:30 crc kubenswrapper[4776]: I0128 08:24:30.113747 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-jrkz5_8c08bbc8-20fd-452e-8d53-4baa6ac41fc2/cert-manager-cainjector/0.log" Jan 28 08:24:30 crc kubenswrapper[4776]: I0128 08:24:30.161722 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-5lrjh_24f93919-f6ec-481d-b6f3-0bfd6fdb7e01/cert-manager-webhook/0.log" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.320694 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:41 crc kubenswrapper[4776]: E0128 08:24:41.321674 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc8949b-e5aa-465c-aad4-ae4931fb6fba" containerName="container-00" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.321688 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc8949b-e5aa-465c-aad4-ae4931fb6fba" containerName="container-00" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.321901 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc8949b-e5aa-465c-aad4-ae4931fb6fba" containerName="container-00" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.323293 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.347937 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.462612 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgtz\" (UniqueName: \"kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.462758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.462829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.564806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgtz\" (UniqueName: \"kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.564950 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.565018 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.565674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.565766 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.585449 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgtz\" (UniqueName: \"kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz\") pod \"redhat-marketplace-hmc76\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:41 crc kubenswrapper[4776]: I0128 08:24:41.647250 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:42 crc kubenswrapper[4776]: I0128 08:24:42.122726 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:42 crc kubenswrapper[4776]: I0128 08:24:42.852273 4776 generic.go:334] "Generic (PLEG): container finished" podID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerID="5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23" exitCode=0 Jan 28 08:24:42 crc kubenswrapper[4776]: I0128 08:24:42.852324 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerDied","Data":"5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23"} Jan 28 08:24:42 crc kubenswrapper[4776]: I0128 08:24:42.852637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerStarted","Data":"f4e6f579f35b4d0fd386f587c95aa8115dad6a31e2c8820c4470de92e4ef7ab7"} Jan 28 08:24:43 crc kubenswrapper[4776]: I0128 08:24:43.863984 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerStarted","Data":"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e"} Jan 28 08:24:44 crc kubenswrapper[4776]: I0128 08:24:44.873830 4776 generic.go:334] "Generic (PLEG): container finished" podID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerID="b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e" exitCode=0 Jan 28 08:24:44 crc kubenswrapper[4776]: I0128 08:24:44.874116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" 
event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerDied","Data":"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e"} Jan 28 08:24:45 crc kubenswrapper[4776]: I0128 08:24:45.310455 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-49nnl_59581e1b-5fa1-4649-b461-20815879a250/nmstate-console-plugin/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.031949 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qgksg_2f1d6d84-d95e-4423-a7c1-7fa987beff1c/nmstate-metrics/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.050349 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fn6dp_087b920d-366b-475c-85f2-e5512596d3f8/nmstate-handler/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.062583 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qgksg_2f1d6d84-d95e-4423-a7c1-7fa987beff1c/kube-rbac-proxy/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.242582 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-5x9ww_180b60f1-288a-4292-9aab-4322b1d1bce2/nmstate-operator/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.378195 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-mjfqk_ff12b52a-7e92-45bd-afd9-e0b577a8607d/nmstate-webhook/0.log" Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.900565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerStarted","Data":"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249"} Jan 28 08:24:46 crc kubenswrapper[4776]: I0128 08:24:46.921369 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hmc76" podStartSLOduration=3.499448736 podStartE2EDuration="5.921354184s" podCreationTimestamp="2026-01-28 08:24:41 +0000 UTC" firstStartedPulling="2026-01-28 08:24:42.855073072 +0000 UTC m=+5654.270733232" lastFinishedPulling="2026-01-28 08:24:45.27697852 +0000 UTC m=+5656.692638680" observedRunningTime="2026-01-28 08:24:46.919886854 +0000 UTC m=+5658.335547014" watchObservedRunningTime="2026-01-28 08:24:46.921354184 +0000 UTC m=+5658.337014344" Jan 28 08:24:51 crc kubenswrapper[4776]: I0128 08:24:51.648335 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:51 crc kubenswrapper[4776]: I0128 08:24:51.648875 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:51 crc kubenswrapper[4776]: I0128 08:24:51.723401 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:52 crc kubenswrapper[4776]: I0128 08:24:52.004128 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:52 crc kubenswrapper[4776]: I0128 08:24:52.065796 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:53 crc kubenswrapper[4776]: I0128 08:24:53.960756 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hmc76" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="registry-server" containerID="cri-o://e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249" gracePeriod=2 Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.496280 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.619250 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities\") pod \"0896ae57-06f4-4a58-8b46-d2853d5fae75\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.619740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content\") pod \"0896ae57-06f4-4a58-8b46-d2853d5fae75\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.619887 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpgtz\" (UniqueName: \"kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz\") pod \"0896ae57-06f4-4a58-8b46-d2853d5fae75\" (UID: \"0896ae57-06f4-4a58-8b46-d2853d5fae75\") " Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.620186 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities" (OuterVolumeSpecName: "utilities") pod "0896ae57-06f4-4a58-8b46-d2853d5fae75" (UID: "0896ae57-06f4-4a58-8b46-d2853d5fae75"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.620517 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.625795 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz" (OuterVolumeSpecName: "kube-api-access-bpgtz") pod "0896ae57-06f4-4a58-8b46-d2853d5fae75" (UID: "0896ae57-06f4-4a58-8b46-d2853d5fae75"). InnerVolumeSpecName "kube-api-access-bpgtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.644985 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0896ae57-06f4-4a58-8b46-d2853d5fae75" (UID: "0896ae57-06f4-4a58-8b46-d2853d5fae75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.722514 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpgtz\" (UniqueName: \"kubernetes.io/projected/0896ae57-06f4-4a58-8b46-d2853d5fae75-kube-api-access-bpgtz\") on node \"crc\" DevicePath \"\"" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.722787 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0896ae57-06f4-4a58-8b46-d2853d5fae75-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.978365 4776 generic.go:334] "Generic (PLEG): container finished" podID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerID="e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249" exitCode=0 Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.978417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerDied","Data":"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249"} Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.978450 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hmc76" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.978469 4776 scope.go:117] "RemoveContainer" containerID="e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249" Jan 28 08:24:54 crc kubenswrapper[4776]: I0128 08:24:54.978458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hmc76" event={"ID":"0896ae57-06f4-4a58-8b46-d2853d5fae75","Type":"ContainerDied","Data":"f4e6f579f35b4d0fd386f587c95aa8115dad6a31e2c8820c4470de92e4ef7ab7"} Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.016692 4776 scope.go:117] "RemoveContainer" containerID="b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.016879 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.026392 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hmc76"] Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.035250 4776 scope.go:117] "RemoveContainer" containerID="5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.088860 4776 scope.go:117] "RemoveContainer" containerID="e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249" Jan 28 08:24:55 crc kubenswrapper[4776]: E0128 08:24:55.089352 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249\": container with ID starting with e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249 not found: ID does not exist" containerID="e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.089399 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249"} err="failed to get container status \"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249\": rpc error: code = NotFound desc = could not find container \"e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249\": container with ID starting with e4fadcd082f81845b91a1111585cff1487716e2b306615113092412d1d949249 not found: ID does not exist" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.089432 4776 scope.go:117] "RemoveContainer" containerID="b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e" Jan 28 08:24:55 crc kubenswrapper[4776]: E0128 08:24:55.089948 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e\": container with ID starting with b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e not found: ID does not exist" containerID="b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.089978 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e"} err="failed to get container status \"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e\": rpc error: code = NotFound desc = could not find container \"b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e\": container with ID starting with b44db00d42a077e1effb0bca635a8c82084fb9a62ef98c375d1cffed933eba1e not found: ID does not exist" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.090000 4776 scope.go:117] "RemoveContainer" containerID="5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23" Jan 28 08:24:55 crc kubenswrapper[4776]: E0128 
08:24:55.090279 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23\": container with ID starting with 5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23 not found: ID does not exist" containerID="5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.090305 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23"} err="failed to get container status \"5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23\": rpc error: code = NotFound desc = could not find container \"5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23\": container with ID starting with 5235f8f8f9bfa2e04611cfd67cc9a1859b07f23302e575a341bbdaed7a5d4d23 not found: ID does not exist" Jan 28 08:24:55 crc kubenswrapper[4776]: I0128 08:24:55.319751 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" path="/var/lib/kubelet/pods/0896ae57-06f4-4a58-8b46-d2853d5fae75/volumes" Jan 28 08:25:01 crc kubenswrapper[4776]: I0128 08:25:01.578241 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9v6mf_633bf947-38aa-4444-911d-ea2f55433a93/prometheus-operator/0.log" Jan 28 08:25:01 crc kubenswrapper[4776]: I0128 08:25:01.699208 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp_f52dbbd1-d020-4074-93eb-706fff6e588b/prometheus-operator-admission-webhook/0.log" Jan 28 08:25:01 crc kubenswrapper[4776]: I0128 08:25:01.738142 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf_194dde85-71e7-4d74-80c4-59e327ac851a/prometheus-operator-admission-webhook/0.log" Jan 28 08:25:01 crc kubenswrapper[4776]: I0128 08:25:01.903845 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qd6sh_f1812216-a0b3-4ae2-9c2c-7086dc74163b/perses-operator/0.log" Jan 28 08:25:01 crc kubenswrapper[4776]: I0128 08:25:01.931173 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-srqjl_215d3c95-e6d6-4022-a435-f6c30c630727/operator/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.203458 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9qzfc_896d6757-3340-421c-937a-d6e35e752bdc/kube-rbac-proxy/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.279613 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9qzfc_896d6757-3340-421c-937a-d6e35e752bdc/controller/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.414296 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.578448 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.598127 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.603256 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:25:17 crc 
kubenswrapper[4776]: I0128 08:25:17.603631 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.778603 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.803675 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.819652 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.855156 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.977671 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-frr-files/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.991605 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-metrics/0.log" Jan 28 08:25:17 crc kubenswrapper[4776]: I0128 08:25:17.995496 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/cp-reloader/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.023398 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/controller/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.152796 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/frr-metrics/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.203785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/kube-rbac-proxy/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.286799 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/kube-rbac-proxy-frr/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.371917 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/reloader/0.log" Jan 28 08:25:18 crc kubenswrapper[4776]: I0128 08:25:18.841367 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-mbw9s_a349654d-030c-4341-b884-8f295ea9dfa9/frr-k8s-webhook-server/0.log" Jan 28 08:25:19 crc kubenswrapper[4776]: I0128 08:25:19.084094 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74f4f84-5s97b_b127309b-519f-42d4-9aca-30708ae2aae1/manager/0.log" Jan 28 08:25:19 crc kubenswrapper[4776]: I0128 08:25:19.215828 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-f68d4f57-8pgd6_10b029bb-8821-4602-9b1d-910d59efc97a/webhook-server/0.log" Jan 28 08:25:19 crc kubenswrapper[4776]: I0128 08:25:19.416653 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mjlbx_6f2ff038-c715-4cff-a872-ac6ae5c7fbff/kube-rbac-proxy/0.log" Jan 28 08:25:19 crc kubenswrapper[4776]: I0128 08:25:19.860575 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mjlbx_6f2ff038-c715-4cff-a872-ac6ae5c7fbff/speaker/0.log" Jan 28 08:25:19 crc kubenswrapper[4776]: I0128 08:25:19.868670 4776 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4bxr_f824bf65-570f-4d47-8006-8e13fb86368f/frr/0.log" Jan 28 08:25:34 crc kubenswrapper[4776]: I0128 08:25:34.759788 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:25:34 crc kubenswrapper[4776]: I0128 08:25:34.935319 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:25:34 crc kubenswrapper[4776]: I0128 08:25:34.936745 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:25:34 crc kubenswrapper[4776]: I0128 08:25:34.960451 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.127324 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/util/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.162763 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/pull/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.177882 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcggrt9_212e186f-3642-483d-adf6-00dfaf77ca5f/extract/0.log" Jan 28 08:25:35 crc 
kubenswrapper[4776]: I0128 08:25:35.287018 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.471688 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.519619 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.520577 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.667601 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/util/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.679088 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/pull/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.697588 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71364ltj_8d1bba84-5283-4516-94aa-2b7fa90c5e6d/extract/0.log" Jan 28 08:25:35 crc kubenswrapper[4776]: I0128 08:25:35.821762 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.002721 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.004787 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.011150 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.205904 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/pull/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.206478 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/util/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.211454 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08jb5nv_4d23dd47-b538-454a-873c-b3cc6b26c92b/extract/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.365289 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:25:36 crc 
kubenswrapper[4776]: I0128 08:25:36.582234 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.582580 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.584942 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.747585 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-utilities/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.765046 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/extract-content/0.log" Jan 28 08:25:36 crc kubenswrapper[4776]: I0128 08:25:36.967401 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.001867 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bfs7z_d6bbc905-926a-485a-a0da-4ac35f39505b/registry-server/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.165851 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.188296 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.213999 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.436163 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-content/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.446759 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/extract-utilities/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.628159 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-45lfz_3d2605e3-4b9a-4dc8-8936-b209875dbdee/marketplace-operator/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.710002 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.778880 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-82xql_abb32b75-3393-47dc-a543-bfa5745c4ec6/registry-server/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.883513 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.884184 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:25:37 crc kubenswrapper[4776]: I0128 08:25:37.921921 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.065383 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-content/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.067329 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/extract-utilities/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.138818 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.279813 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6fkfh_4972ca48-b9b2-4811-9d6a-15aef7b4a2c1/registry-server/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.345285 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.353947 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.378675 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" 
Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.561387 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-content/0.log" Jan 28 08:25:38 crc kubenswrapper[4776]: I0128 08:25:38.566294 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/extract-utilities/0.log" Jan 28 08:25:39 crc kubenswrapper[4776]: I0128 08:25:39.239156 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ft6p9_ae5aaed6-76ba-4b87-aafc-a96a98df7b3c/registry-server/0.log" Jan 28 08:25:55 crc kubenswrapper[4776]: I0128 08:25:55.849963 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9v6mf_633bf947-38aa-4444-911d-ea2f55433a93/prometheus-operator/0.log" Jan 28 08:25:56 crc kubenswrapper[4776]: I0128 08:25:56.024841 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-pt6lf_194dde85-71e7-4d74-80c4-59e327ac851a/prometheus-operator-admission-webhook/0.log" Jan 28 08:25:56 crc kubenswrapper[4776]: I0128 08:25:56.056674 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-srqjl_215d3c95-e6d6-4022-a435-f6c30c630727/operator/0.log" Jan 28 08:25:56 crc kubenswrapper[4776]: I0128 08:25:56.058080 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8658c4ddf5-54ttp_f52dbbd1-d020-4074-93eb-706fff6e588b/prometheus-operator-admission-webhook/0.log" Jan 28 08:25:56 crc kubenswrapper[4776]: I0128 08:25:56.077877 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qd6sh_f1812216-a0b3-4ae2-9c2c-7086dc74163b/perses-operator/0.log" Jan 28 08:26:03 crc kubenswrapper[4776]: I0128 08:26:03.852685 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:26:03 crc kubenswrapper[4776]: I0128 08:26:03.853136 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:26:33 crc kubenswrapper[4776]: I0128 08:26:33.852686 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:26:33 crc kubenswrapper[4776]: I0128 08:26:33.853300 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:27:03 crc kubenswrapper[4776]: I0128 08:27:03.851708 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:27:03 
crc kubenswrapper[4776]: I0128 08:27:03.852397 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:27:03 crc kubenswrapper[4776]: I0128 08:27:03.852488 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:27:03 crc kubenswrapper[4776]: I0128 08:27:03.853813 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:27:03 crc kubenswrapper[4776]: I0128 08:27:03.853940 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc" gracePeriod=600 Jan 28 08:27:04 crc kubenswrapper[4776]: I0128 08:27:04.437655 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" containerID="c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc" exitCode=0 Jan 28 08:27:04 crc kubenswrapper[4776]: I0128 08:27:04.437696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" 
event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc"} Jan 28 08:27:04 crc kubenswrapper[4776]: I0128 08:27:04.437990 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerStarted","Data":"8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6"} Jan 28 08:27:04 crc kubenswrapper[4776]: I0128 08:27:04.438020 4776 scope.go:117] "RemoveContainer" containerID="7dc613bfcf78ec47e793643917d32cad5f7df1acb3c6b03adab053395b3288a1" Jan 28 08:27:52 crc kubenswrapper[4776]: I0128 08:27:52.021128 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b68d949-d849-4a55-a73a-295dac526f50" containerID="15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d" exitCode=0 Jan 28 08:27:52 crc kubenswrapper[4776]: I0128 08:27:52.021276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zk778/must-gather-v5bhp" event={"ID":"6b68d949-d849-4a55-a73a-295dac526f50","Type":"ContainerDied","Data":"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d"} Jan 28 08:27:52 crc kubenswrapper[4776]: I0128 08:27:52.022444 4776 scope.go:117] "RemoveContainer" containerID="15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d" Jan 28 08:27:52 crc kubenswrapper[4776]: I0128 08:27:52.356926 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zk778_must-gather-v5bhp_6b68d949-d849-4a55-a73a-295dac526f50/gather/0.log" Jan 28 08:28:04 crc kubenswrapper[4776]: I0128 08:28:04.353667 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zk778/must-gather-v5bhp"] Jan 28 08:28:04 crc kubenswrapper[4776]: I0128 08:28:04.354419 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-zk778/must-gather-v5bhp" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="copy" containerID="cri-o://cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30" gracePeriod=2 Jan 28 08:28:04 crc kubenswrapper[4776]: I0128 08:28:04.368049 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zk778/must-gather-v5bhp"] Jan 28 08:28:04 crc kubenswrapper[4776]: I0128 08:28:04.784591 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zk778_must-gather-v5bhp_6b68d949-d849-4a55-a73a-295dac526f50/copy/0.log" Jan 28 08:28:04 crc kubenswrapper[4776]: I0128 08:28:04.785454 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.259611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4n4t\" (UniqueName: \"kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t\") pod \"6b68d949-d849-4a55-a73a-295dac526f50\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.259873 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output\") pod \"6b68d949-d849-4a55-a73a-295dac526f50\" (UID: \"6b68d949-d849-4a55-a73a-295dac526f50\") " Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.293335 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zk778_must-gather-v5bhp_6b68d949-d849-4a55-a73a-295dac526f50/copy/0.log" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.306452 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b68d949-d849-4a55-a73a-295dac526f50" 
containerID="cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30" exitCode=143 Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.306511 4776 scope.go:117] "RemoveContainer" containerID="cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.306823 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zk778/must-gather-v5bhp" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.346874 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t" (OuterVolumeSpecName: "kube-api-access-r4n4t") pod "6b68d949-d849-4a55-a73a-295dac526f50" (UID: "6b68d949-d849-4a55-a73a-295dac526f50"). InnerVolumeSpecName "kube-api-access-r4n4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.364980 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4n4t\" (UniqueName: \"kubernetes.io/projected/6b68d949-d849-4a55-a73a-295dac526f50-kube-api-access-r4n4t\") on node \"crc\" DevicePath \"\"" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.406918 4776 scope.go:117] "RemoveContainer" containerID="15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.531785 4776 scope.go:117] "RemoveContainer" containerID="cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30" Jan 28 08:28:05 crc kubenswrapper[4776]: E0128 08:28:05.532657 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30\": container with ID starting with cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30 not found: ID does not exist" 
containerID="cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.532702 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30"} err="failed to get container status \"cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30\": rpc error: code = NotFound desc = could not find container \"cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30\": container with ID starting with cde702b39460db9671b040946db74074d9281a9c42bc521b2efb43b9feb6cc30 not found: ID does not exist" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.532731 4776 scope.go:117] "RemoveContainer" containerID="15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d" Jan 28 08:28:05 crc kubenswrapper[4776]: E0128 08:28:05.532998 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d\": container with ID starting with 15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d not found: ID does not exist" containerID="15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.533018 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d"} err="failed to get container status \"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d\": rpc error: code = NotFound desc = could not find container \"15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d\": container with ID starting with 15f5303e3119a83c3d42403b41ffe77ee425fae414a5879664d5edddb7c6c26d not found: ID does not exist" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.545742 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6b68d949-d849-4a55-a73a-295dac526f50" (UID: "6b68d949-d849-4a55-a73a-295dac526f50"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 08:28:05 crc kubenswrapper[4776]: I0128 08:28:05.571529 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b68d949-d849-4a55-a73a-295dac526f50-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 28 08:28:07 crc kubenswrapper[4776]: I0128 08:28:07.317233 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b68d949-d849-4a55-a73a-295dac526f50" path="/var/lib/kubelet/pods/6b68d949-d849-4a55-a73a-295dac526f50/volumes" Jan 28 08:29:33 crc kubenswrapper[4776]: I0128 08:29:33.852797 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:29:33 crc kubenswrapper[4776]: I0128 08:29:33.853458 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.164073 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w"] Jan 28 08:30:00 crc kubenswrapper[4776]: E0128 08:30:00.165422 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="registry-server" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.165446 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="registry-server" Jan 28 08:30:00 crc kubenswrapper[4776]: E0128 08:30:00.165479 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="extract-utilities" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.165492 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="extract-utilities" Jan 28 08:30:00 crc kubenswrapper[4776]: E0128 08:30:00.165527 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="copy" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.165542 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="copy" Jan 28 08:30:00 crc kubenswrapper[4776]: E0128 08:30:00.165616 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="gather" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.165629 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="gather" Jan 28 08:30:00 crc kubenswrapper[4776]: E0128 08:30:00.165668 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="extract-content" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.165681 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="extract-content" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.166055 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b68d949-d849-4a55-a73a-295dac526f50" 
containerName="gather" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.166090 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b68d949-d849-4a55-a73a-295dac526f50" containerName="copy" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.166121 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0896ae57-06f4-4a58-8b46-d2853d5fae75" containerName="registry-server" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.167325 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.169925 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.170635 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.188000 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w"] Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.275754 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.276195 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tw9\" (UniqueName: \"kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9\") pod \"collect-profiles-29493150-p766w\" (UID: 
\"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.276247 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.378067 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tw9\" (UniqueName: \"kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.378480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.378800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.379892 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.389098 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.398163 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9tw9\" (UniqueName: \"kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9\") pod \"collect-profiles-29493150-p766w\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:00 crc kubenswrapper[4776]: I0128 08:30:00.502579 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:01 crc kubenswrapper[4776]: I0128 08:30:01.045986 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w"] Jan 28 08:30:01 crc kubenswrapper[4776]: I0128 08:30:01.722721 4776 generic.go:334] "Generic (PLEG): container finished" podID="4f3b3179-2fb9-4e9b-8da9-e2465d7506d0" containerID="09c3cb6e034348b183126604be9afdc395012f90a31d41cd3b25ccbbb55dd3ec" exitCode=0 Jan 28 08:30:01 crc kubenswrapper[4776]: I0128 08:30:01.723058 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" event={"ID":"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0","Type":"ContainerDied","Data":"09c3cb6e034348b183126604be9afdc395012f90a31d41cd3b25ccbbb55dd3ec"} Jan 28 08:30:01 crc kubenswrapper[4776]: I0128 08:30:01.723122 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" event={"ID":"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0","Type":"ContainerStarted","Data":"632cc5d9c433096da9e895763b8b59f39ebb012294f3f58fdcfd7788aea38530"} Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.089731 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.238148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume\") pod \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.238219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9tw9\" (UniqueName: \"kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9\") pod \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.238602 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume\") pod \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\" (UID: \"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0\") " Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.239081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0" (UID: "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.239848 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.245694 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9" (OuterVolumeSpecName: "kube-api-access-k9tw9") pod "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0" (UID: "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0"). InnerVolumeSpecName "kube-api-access-k9tw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.245832 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0" (UID: "4f3b3179-2fb9-4e9b-8da9-e2465d7506d0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.342152 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.342197 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9tw9\" (UniqueName: \"kubernetes.io/projected/4f3b3179-2fb9-4e9b-8da9-e2465d7506d0-kube-api-access-k9tw9\") on node \"crc\" DevicePath \"\"" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.765913 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" event={"ID":"4f3b3179-2fb9-4e9b-8da9-e2465d7506d0","Type":"ContainerDied","Data":"632cc5d9c433096da9e895763b8b59f39ebb012294f3f58fdcfd7788aea38530"} Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.766459 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="632cc5d9c433096da9e895763b8b59f39ebb012294f3f58fdcfd7788aea38530" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.767969 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493150-p766w" Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.852303 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:30:03 crc kubenswrapper[4776]: I0128 08:30:03.852364 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:30:04 crc kubenswrapper[4776]: I0128 08:30:04.180457 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd"] Jan 28 08:30:04 crc kubenswrapper[4776]: I0128 08:30:04.190596 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493105-c69pd"] Jan 28 08:30:05 crc kubenswrapper[4776]: I0128 08:30:05.316390 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1521a499-587e-4bc9-86f4-a572e4e238cd" path="/var/lib/kubelet/pods/1521a499-587e-4bc9-86f4-a572e4e238cd/volumes" Jan 28 08:30:33 crc kubenswrapper[4776]: I0128 08:30:33.852979 4776 patch_prober.go:28] interesting pod/machine-config-daemon-stl56 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 08:30:33 crc kubenswrapper[4776]: I0128 08:30:33.853885 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 08:30:33 crc kubenswrapper[4776]: I0128 08:30:33.853991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stl56" Jan 28 08:30:33 crc kubenswrapper[4776]: I0128 08:30:33.855117 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6"} pod="openshift-machine-config-operator/machine-config-daemon-stl56" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 08:30:33 crc kubenswrapper[4776]: I0128 08:30:33.855220 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" containerName="machine-config-daemon" containerID="cri-o://8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6" gracePeriod=600 Jan 28 08:30:34 crc kubenswrapper[4776]: E0128 08:30:34.004962 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067" Jan 28 08:30:34 crc kubenswrapper[4776]: I0128 08:30:34.140005 4776 generic.go:334] "Generic (PLEG): container finished" podID="3539113f-fe53-40a0-a08c-d7f86951d067" 
containerID="8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6" exitCode=0 Jan 28 08:30:34 crc kubenswrapper[4776]: I0128 08:30:34.140070 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stl56" event={"ID":"3539113f-fe53-40a0-a08c-d7f86951d067","Type":"ContainerDied","Data":"8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6"} Jan 28 08:30:34 crc kubenswrapper[4776]: I0128 08:30:34.140120 4776 scope.go:117] "RemoveContainer" containerID="c2b92e7ab2ce3f30db74b1079e7ac76f906a1d06093b9796e9ab6c35e1696edc" Jan 28 08:30:34 crc kubenswrapper[4776]: I0128 08:30:34.140925 4776 scope.go:117] "RemoveContainer" containerID="8e877cef735dd81ee889a991a3852dca7c9b2be43440d24baf1ab12512de54c6" Jan 28 08:30:34 crc kubenswrapper[4776]: E0128 08:30:34.141589 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stl56_openshift-machine-config-operator(3539113f-fe53-40a0-a08c-d7f86951d067)\"" pod="openshift-machine-config-operator/machine-config-daemon-stl56" podUID="3539113f-fe53-40a0-a08c-d7f86951d067"